Copyright © 2015 Bert N. Langford (Images may be subject to copyright. Please send feedback)
Welcome to Our Generation USA!
Social Equality
addresses the successes and failures in achieving full equality among Americans, regardless of politics, race, gender, creed, age, economic circumstances, religion, or disability, and including all minorities
See also:
Humanitarians
Feminism
Great Americans
Culture
Law & Order
Politics
Worst of Humanity
Social Equality vs. Social Inequality including a List of Countries by Income Equality and by Distribution of Wealth
- Click here for a List of Countries by Income Equality
- Click here for a List of Countries by distribution of wealth
Social equality is a state of affairs in which all people within a specific society or isolated group have the same status in certain respects, including civil rights, freedom of speech, property rights and equal access to certain social goods and social services.
It also includes concepts of health equality, economic equality, and other forms of social security, as well as equal opportunities and obligations, and so involves the whole of society.
Social equality requires the absence of legally enforced social class or caste boundaries and the absence of discrimination motivated by an inalienable part of a person's identity. For example, sex, gender, race, age, sexual orientation, origin, caste or class, income or property, language, religion, convictions, opinions, health or disability must absolutely not result in unequal treatment under the law and should not reduce opportunities unjustifiably.
Equality of opportunity is interpreted as being judged by ability, which is compatible with a free-market economy. A relevant problem is horizontal inequality: the inequality between two persons of the same origin and ability who are given differing opportunities, such as in education or through inherited capital.
Concepts of social equality vary by philosophy and individual. Unlike egalitarianism, social equality does not necessarily require all social inequalities to be eliminated by artificial means; it often recognizes and respects natural differences between people.
The standard of equality that holds everyone is created equal at birth is called ontological equality. This type of equality can be seen in documents such as the Declaration of Independence.
This early document, which states many of the values of the United States of America, has this idea of equality embedded in it. It clearly states that "all men are created equal, that they are endowed by their Creator with certain unalienable Rights". The statement reflects the philosophy of John Locke and his idea that we are all equal in certain natural rights.
Although this standard of equality is seen in documents as important as the Declaration of Independence, it is "one not often invoked in policy debates these days". However this notion of equality is often used to justify inequalities such as material inequality.
Dalton Conley claims that ontological equality is used to justify material inequality by putting a spotlight on the fact, legitimated by theology, that "the distribution of power and resources here on earth does not matter, because all of us are equally children of God and will have to face our maker upon dying".
Dalton Conley, the author of You May Ask Yourself, claims that ontological equality can also be used to put forth the notion that poverty is a virtue. Luciano Floridi, a philosopher of information, wrote about what he calls the ontological equality principle. His work on information ethics raises the importance of equality when presenting information. Here is a short sample of his work:
Information ethics is impartial and universal because it brings to ultimate completion the process of enlargement of the concept of what may count as a center of a (no matter how minimal) moral claim, which now includes every instance of being understood informationally, no matter whether physically implemented or not.
In this respect information ethics holds that every entity as an expression of being, has a dignity constituted by its mode of existence and essence (the collection of all the elementary properties that constitute it for what it is), which deserve to be respected (at least in a minimal and overridable sense), and hence place moral claims on the interacting agent and ought to contribute to the constraint and guidance of his ethical decisions and behavior.
Floridi goes on to claim that this "ontological equality principle means that any form of reality (any instance of information/being), simply for the fact of being what it is, enjoys a minimal, initial, over-ridable, equal right to exist and develop in a way which is appropriate to its nature." The values in his claims correlate with those presented in Dalton Conley's sociological textbook You May Ask Yourself.
The notion of "ontological equality" describes equality by saying everything is equal by nature. Everyone is created equal at birth. Everything has equal right to exist and develop by its nature.
Opportunity:
Main article: Equality of opportunity
Another standard of equality is equality of opportunity, "the idea that everyone has an equal chance to achieve wealth, social prestige, and power because the rules of the game, so to speak, are the same for everyone". This concept can be applied to society by saying that no one has a head start.
This means that, for any social equality issue concerning wealth, social prestige, power, or the like, the equality of opportunity standard can defend the idea that everyone had the same start.
This views society almost as a game and any of the differences in equality are due to luck and playing the "game" to one's best ability. Conley gives an example of this standard of equality by using a game of Monopoly to describe society.
He claims that "Monopoly follows the rules of equality of opportunity" by explaining that everyone had an equal chance when starting the game and any differences were a result of the luck of the dice roll and the skill of the player to make choices to benefit their wealth.
Applying this example to society, under the standard of equality of opportunity inequality is not considered unjust: because the rules of society's games are fair and the same for all, any resulting inequalities are themselves deemed fair.
Lesley A. Jacobs, the author of Pursuing Equal Opportunities: The Theory and Practice of Egalitarian Justice, talks about equality of opportunity and its importance relating to egalitarian justice.
Jacobs states that at the core of equality of opportunity "is the concept that in competitive procedures designed for the allocation of scarce resources and the distribution of the benefits and burdens of social life, those procedures should be governed by criteria that are relevant to the particular goods at stake in the competition and not by irrelevant considerations such as race, religion, class, gender, disability, sexual orientation, ethnicity, or other factors that may hinder some of the competitors' opportunities at success" (Jacobs, 10). This concept identifies factors such as race, gender, and class that should not be considered when discussing equality through this notion.
Conley also notes that this standard of equality is at the heart of a bourgeois society, such as a modern capitalist society, or "a society of commerce in which the maximization of profit is the primary business incentive". It was the equal-opportunity ideology that civil rights activists adopted during the Civil Rights Movement of the 1960s, using it to argue that Jim Crow laws were incompatible with the standard of equality of opportunity.
Condition:
Main article: Leveling mechanism
Another notion of equality introduced by Conley is equality of condition: the idea that everyone should have an equal starting point. Conley returns to his Monopoly example to explain this standard. If a four-player game began with two players each enjoying a $5,000 advantage and already owning hotels and other property, while the other two players owned no property and each started with a $5,000 deficit, then from the perspective of equality of condition one can argue that the rules of the game "need to be altered in order to compensate for inequalities in the relative starting positions".
From this standpoint, policies are formed to even out starting conditions, creating fairer competition in society. This is where social engineering comes into play: society is deliberately changed to give everyone an equality of condition regardless of race, gender, class, or religion, where it can be shown that existing social structures make competition unfair for them.
Sharon E. Kahn, author of Academic Freedom and the Inclusive University, also discusses equality of condition and how it relates to individual freedom.
Kahn claims that individual freedom requires equality of condition, "which requires much more than the elimination of legal barriers: it requires the creation of a level playing field that eliminates structural barriers to opportunity".
Kahn's work discusses the academic structure and its problems with equality, claiming that to "ensure equity ... we need to recognize that the university structure and its organizational culture have traditionally privileged some and marginalized others; we need to go beyond theoretical concepts of equality by eliminating systemic barriers that hinder the equal participation of members of all groups; we need to create an equality of condition, not merely an equality of opportunity".
"Notions of equity, diversity, and inclusiveness begin with a set of premises about individualism, freedom and rights that take as given the existence of deeply rooted inequalities in social structure," therefore in order to have a culture of the inclusive university, it would have to "be based on values of equity; that is, equality of condition" eliminating all systemic barriers that go against equality.
Outcome:
Main article: Equality of outcome
A fourth standard of equality is equality of outcome, "a position that argues each player must end up with the same amount regardless of the fairness". This ideology is predominantly a Marxist philosophy concerned with the equal distribution of power and resources rather than with the rules of society. In this standard of equality, the idea is that "everyone contributes to society and to the economy according to what they do best".
Under this notion of equality, Conley states that "nobody will earn more power, prestige, and wealth by working harder".
When defining equality of outcome in education, "the goals should not be the liberal one of equality of access but equality of outcome for the median number of each identifiable non-educationally defined group, i.e. the average women, negro, or proletarian or rural dweller should have the same level of educational attainment as the average male, white, suburbanite".
From this notion of equality, the outcomes and benefits of education should be the same for all, regardless of race, gender, religion, etc. In Hewitt's view, equality of outcome should result in "a comparable range of achievements between a specific disadvantaged group – such as an ethnic minority, women, lone parents and the disabled – and society as a whole".
___________________________________________________________________________
Social inequality occurs when resources in a given society are distributed unevenly, typically through norms of allocation, that engender specific patterns along lines of socially defined categories of persons.
It is the differentiation preference of access of social goods in the society brought about by power, religion, kinship, prestige, race, ethnicity, gender, age, sexual orientation, and class.
Social rights include access to the labor market and sources of income, health care, freedom of speech, education, political representation, and participation. Social inequality linked to economic inequality, usually described in terms of the unequal distribution of income or wealth, is a frequently studied type of social inequality.
Although the disciplines of economics and sociology generally use different theoretical approaches to examine and explain economic inequality, both fields are actively involved in researching this inequality. However, social and natural resources other than purely economic resources are also unevenly distributed in most societies and may contribute to social status.
Norms of allocation can also affect the distribution of rights and privileges, social power, access to public goods such as education or the judicial system, adequate housing, transportation, credit and financial services such as banking and other social goods and services.
Many societies worldwide claim to be meritocracies—that is, that their societies exclusively distribute resources on the basis of merit. The term "meritocracy" was coined by Michael Young in his 1958 dystopian essay "The Rise of the Meritocracy" to demonstrate the social dysfunctions that he anticipated arising in societies where the elites believe that they are successful entirely on the basis of merit, so the adoption of this term into English without negative connotations is ironic.
Young was concerned that the Tripartite System of education being practiced in the United Kingdom at the time he wrote the essay considered merit to be "intelligence-plus-effort, its possessors ... identified at an early age and selected for appropriate intensive education" and that the "obsession with quantification, test-scoring, and qualifications" it supported would create an educated middle-class elite at the expense of the education of the working class, inevitably resulting in injustice and – eventually – revolution. A modern representation of the sort of "meritocracy" Young feared may be seen in the series 3%.
Although merit matters to some degree in many societies, research shows that the distribution of resources in societies often follows hierarchical social categorizations of persons to a degree too significant to warrant calling these societies "meritocratic", since even exceptional intelligence, talent, or other forms of merit may not be compensatory for the social disadvantages people face.
In many cases, social inequality is linked to racial inequality, ethnic inequality, and gender inequality, as well as other social statuses and these forms can be related to corruption.
The most common metric for comparing social inequality in different nations is the Gini coefficient, which measures the concentration of wealth and income in a nation from 0 (evenly distributed wealth and income) to 1 (one person has all wealth and income). Two nations may have identical Gini coefficients but dramatically different economic (output) and/or quality of life, so the Gini coefficient must be contextualized for meaningful comparisons to be made.
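As a worked illustration (not from the source), the Gini coefficient described above can be computed directly from a list of incomes using the standard mean-absolute-difference formula. This is a minimal sketch; the function name and the sample figures are hypothetical.

```python
def gini(values):
    """Gini coefficient of a list of incomes or wealth holdings.

    0.0 = perfectly even distribution; the maximum for n people,
    (n - 1) / n, is reached when one person holds everything.
    Uses the mean-absolute-difference form:
        G = sum_ij |x_i - x_j| / (2 * n^2 * mean(x))
    """
    n = len(values)
    mean = sum(values) / n
    # Sum of absolute differences over all ordered pairs.
    abs_diffs = sum(abs(a - b) for a in values for b in values)
    return abs_diffs / (2 * n * n * mean)

print(gini([5, 5, 5, 5]))   # 0.0  — wealth spread evenly
print(gini([10, 0, 0, 0]))  # 0.75 — one person holds everything
```

The second result illustrates the point made above about context: with only four people, the most extreme concentration yields 0.75 rather than 1, so identical coefficients can mask very different underlying distributions.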
Overview:
Social inequality is found in almost every society. It is shaped by a range of structural factors, such as geographical location or citizenship status, and is often underpinned by cultural discourses and identities defining, for example, whether the poor are 'deserving' or 'undeserving'.
In simple societies, those with few social roles and statuses occupied by their members, social inequality may be very low. In tribal societies, for example, a tribal head or chieftain may hold some privileges, use some tools, or wear marks of office to which others do not have access, but the daily life of the chieftain is very much like the daily life of any other tribal member.
Anthropologists identify such highly egalitarian cultures as "kinship-oriented", which appear to value social harmony more than wealth or status. These cultures are contrasted with materially oriented cultures in which status and wealth are prized and competition and conflict are common. Kinship-oriented cultures may actively work to prevent social hierarchies from developing because they believe that could lead to conflict and instability.
In today's world, most of the population lives in complex rather than simple societies. As social complexity increases, inequality tends to increase, along with the gap between the poorest and the wealthiest members of society.
Societies can be classified by their degree of social inequality into egalitarian, ranked, and stratified societies.
Egalitarian societies are communities advocating social equality through equal opportunities and rights, and hence no discrimination. People with special skills are not viewed as superior to the rest, and leaders hold influence rather than power.
The norms and beliefs of an egalitarian society favor equal sharing and equal participation; simply put, there are no classes. Ranked societies are mostly agricultural communities grouped hierarchically under a chief, who is viewed as having high status in the society. In these societies people are clustered by status and prestige rather than by access to power and resources.
The chief is the most influential person, followed by his family and relatives; those more distantly related to him rank lower. Stratified societies are ranked hierarchically into upper, middle, and lower classes, with classification based on wealth, power, and prestige.
Members of the upper class are usually the leaders and the most influential people in the society. It is possible for a person to move from one stratum to another, and social status can also be inherited from one generation to the next.
There are five systems or types of social inequality:
- wealth inequality,
- treatment and responsibility inequality,
- political inequality,
- life inequality,
- and membership inequality.
Political inequality is the difference brought about by unequal access to governmental resources, and therefore unequal civic standing. In treatment and responsibility inequality, some people benefit more and receive more privileges than others.
In workplaces, some are given more responsibilities, and hence better compensation and benefits, than equally qualified colleagues. Membership inequality concerns one's standing as a member of a family, nation, or faith. Life inequality arises from disparities in opportunities which, where present, improve a person's quality of life.
Finally, income and wealth inequality is the disparity in what individuals can earn day to day, which contributes to their total monthly or yearly income.
Major examples of social inequality include the income gap, gender inequality, health care, and social class. In health care, some individuals receive better and more specialized care than others, though they are also expected to pay more for these services. Class differences become evident at public gatherings, where upper-class people are given the best seats, the most hospitality, and first priority.
Status in society is of two types which are ascribed characteristics and achieved characteristics. Ascribed characteristics are those present at birth or assigned by others and over which an individual has little or no control. Examples include sex, skin colour, eye shape, place of birth, sexuality, gender identity, parentage and social status of parents.
Achieved characteristics are those which we earn or choose; examples include level of education, marital status, leadership status and other measures of merit. In most societies, an individual's social status is a combination of ascribed and achieved factors.
In some societies, however, only ascribed statuses are considered in determining one's social status and there exists little to no social mobility and, therefore, few paths to more social equality. This type of social inequality is generally referred to as caste inequality.
One's social location in a society's overall structure of social stratification affects and is affected by almost every aspect of social life and one's life chances. The single best predictor of an individual's future social status is the social status into which they were born.
Theoretical approaches to explaining social inequality concentrate on how such social differentiations arise, what types of resources are being allocated (for example, reserves versus resources), what roles human cooperation and conflict play in allocating resources, and how these differing types and forms of inequality affect the overall functioning of a society.
The variables considered most important in explaining inequality and the manner in which those variables combine to produce the inequities and their social consequences in a given society can change across time and place.
In addition to interest in comparing and contrasting social inequality at local and national levels, in the wake of today's globalizing processes, the most interesting question becomes: what does inequality look like on a worldwide scale and what does such global inequality bode for the future? In effect, globalization reduces the distances of time and space, producing a global interaction of cultures and societies and social roles that can increase global inequities.
Inequality and ideology:
Philosophical questions about social ethics and the desirability or inevitability of inequality in human societies have given rise to a spate of ideologies to address such questions. We can broadly classify these ideologies on the basis of whether they justify or legitimize inequality, casting it as desirable or inevitable, or whether they cast equality as desirable and inequality as a feature of society to be reduced or eliminated.
One end of this ideological continuum can be called "Individualist", the other "Collectivist".
In Western societies, there is a long history associated with the idea of individual ownership of property and economic liberalism, the ideological belief in organizing the economy on individualist lines such that the greatest possible number of economic decisions are made by individuals and not by collective institutions or organizations.
Laissez-faire, free market ideologies—including classical liberalism, neoliberalism, and libertarianism—are formed around the idea that social inequality is a "natural" feature of societies, is therefore inevitable and, in some philosophies, even desirable.
Inequality provides for differing goods and services to be offered on the open market, spurs ambition, and provides incentive for industriousness and innovation. At the other end of the continuum, collectivists place little to no trust in "free market" economic systems, noting widespread lack of access among specific groups or classes of individuals to the costs of entry to the market.
Widespread inequalities often lead to conflict and dissatisfaction with the current social order. Such ideologies include Fabianism and socialism. Inequality, in these ideologies, must be reduced, eliminated, or kept under tight control through collective regulation.
Furthermore, in some views inequality is natural but should not affect certain fundamental human needs, human rights, and the initial chances given to individuals (e.g. through education), and becomes disproportionate due to various problematic systemic structures.
Though the above discussion is limited to specific Western ideologies, similar thinking can be found, historically, in differing societies throughout the world. While, in general, eastern societies tend toward collectivism, elements of individualism and free market organization can be found in certain regions and historical eras.
Classic Chinese society in the Han and Tang dynasties, for example, while highly organized into tight hierarchies of horizontal inequality with a distinct power elite, also had many elements of free trade among its various regions and subcultures.
Social mobility is the movement of individuals, ethnic groups, or nations along social strata or hierarchies, reflected in changes in literacy, income distribution, education, and health status. The movement can be vertical or horizontal.
- Vertical mobility is upward or downward movement along social strata, occurring for example through a change of job or through marriage.
- Horizontal mobility is movement between positions that are equally ranked. Intra-generational mobility is a change in social status within a single generation (a single lifetime).
For example, a person moves from junior staff in an organization to senior management. Absolute mobility is where a person attains better social status than their parents, which can result from improved security, economic development, and a better education system. Relative mobility is where some individuals attain higher social rank than their parents.
Today, some believe that social inequality often creates political conflict, and there is growing consensus that political structures determine how such conflicts are resolved. Under this line of thinking, adequately designed social and political institutions are seen as ensuring the smooth functioning of economic markets such that there is political stability, which improves the long-term outlook, enhances labor and capital productivity, and so stimulates economic growth.
With higher economic growth, net gains are positive across all levels and political reforms are easier to sustain. This may explain why, over time, in more egalitarian societies fiscal performance is better, stimulating greater accumulation of capital and higher growth.
Inequality and social class:
Main article: Social class
Socioeconomic status (SES) is a combined total measure of a person's work experience and of an individual's or family's economic and social position in relation to others, based on income, education, and occupation. It is often used as synonymous with social class, a set of hierarchical social categories that indicate an individual's or household's relative position in a stratified matrix of social relationships.
Social class is delineated by a number of variables, some of which change across time and place. For Karl Marx, there exist two major social classes with significant inequality between the two. The two are delineated by their relationship to the means of production in a given society. Those two classes are defined as the owners of the means of production and those who sell their labor to the owners of the means of production.
In capitalistic societies, the two classifications represent the opposing social interests of its members, capital gain for the capitalists and good wages for the laborers, creating social conflict.
Max Weber uses social classes to examine wealth and status. For him, social class is strongly associated with prestige and privileges. It may explain social reproduction, the tendency of social classes to remain stable across generations maintaining most of their inequalities as well.
Such inequalities include differences in income, wealth, access to education, pension levels, social status, socioeconomic safety-net. In general, social class can be defined as a large category of similarly ranked people located in a hierarchy and distinguished from other large categories in the hierarchy by such traits as occupation, education, income, and wealth.
In modern Western societies, inequalities are often broadly classified into three major divisions of social class: upper class, middle class, and lower class. Each of these classes can be further subdivided into smaller classes (e.g. "upper middle"). Members of different classes have varied access to financial resources, which affects their placement in the social stratification system.
Class, race, and gender are forms of stratification that bring inequality and determine differences in the allocation of societal rewards. Occupation is the primary determinant of a person's class, since it affects their lifestyle, opportunities, culture, and the kind of people they associate with. The lower class comprises the poor in society, who have limited opportunities.
The working class are those in blue-collar jobs, who strongly affect the economic level of a nation. The middle classes often rely on both spouses' employment and depend on bank credit and medical coverage. The upper middle class are professionals who are strong because of economic resources and supportive institutions. The upper class, finally, usually consists of wealthy families whose economic power comes from accumulated family wealth rather than earned income.
Social stratification is the hierarchical arrangement of society by social class, wealth, and political influence. A society can be politically stratified based on authority and power, economically stratified based on income level and wealth, and occupationally stratified based on occupation. Some roles, for example doctors, engineers, and lawyers, are highly ranked and thus give orders, while the rest receive them.
There are three systems of social stratification: the caste system, the estate system, and the class system. In the caste system, status is ascribed at birth: children receive the same stratification as their parents. The caste system has been linked to religion and is thus seen as permanent. A stratum may be superior or inferior, influencing the occupations and social roles assigned to a person.
In the estate system, people were required to work the land of their lords in exchange for services such as military protection, and communities were ranked according to the nobility of their lords. The class system is based on income inequality and socio-political status.
People can move between classes as they increase their level of income or gain authority, and they are expected to maximize their innate abilities and possessions. The characteristics of social stratification are that it is universal, social, ancient, diverse in form, and consequential.
The quantitative variables most often used as an indicator of social inequality are income and wealth. In a given society, the distribution of individual or household accumulation of wealth tells us more about variation in well-being than does income, alone. Gross Domestic Product (GDP), especially per capita GDP, is sometimes used to describe economic inequality at the international or global level.
A better measure at that level, however, is the Gini coefficient, a measure of statistical dispersion used to represent the distribution of a specific quantity, such as income or wealth, at a global level, among a nation's residents, or even within a metropolitan area. Other widely used measures of economic inequality are the percentage of people living on under US$1.25 or $2 a day and the Palma ratio, which compares the share of national income held by the wealthiest 10% of the population with the share held by the poorest 40%.
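To make the Palma measure concrete: it is conventionally defined as the income share of the richest 10% divided by the share of the poorest 40%. The sketch below assumes that definition, with the quantile cuts simplified to whole-list indices; the function name and sample figures are hypothetical.

```python
def palma_ratio(incomes):
    """Palma ratio: income share of the richest 10% divided by the
    share of the poorest 40%. Under perfect equality this equals
    10/40 = 0.25; higher values indicate greater inequality."""
    xs = sorted(incomes)
    n = len(xs)
    # Simplified quantile cuts: assumes n is large enough (or divisible)
    # that integer indices approximate the 40th and 90th percentiles.
    bottom_40 = sum(xs[: int(n * 0.4)])
    top_10 = sum(xs[int(n * 0.9):])
    return top_10 / bottom_40

print(palma_ratio([1] * 10))        # 0.25 — perfectly equal incomes
print(palma_ratio([1] * 9 + [16]))  # 4.0  — top earner dominates
```

Because it focuses on the two tails rather than the whole distribution, the Palma ratio can distinguish cases that a single Gini coefficient blurs together.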
Patterns of inequality:
There are a number of socially defined characteristics of individuals that contribute to social status and, therefore, to equality or inequality within a society. When researchers use quantitative variables such as income or wealth to measure inequality, examination of the data reveals patterns indicating that these other social variables contribute to income or wealth as intervening variables.
Significant inequalities in income and wealth are found when specific socially defined categories of people are compared. Among the most pervasive of these variables are sex/gender, race, and ethnicity. This is not to say, in societies wherein merit is considered to be the primary factor determining one's place or rank in the social order, that merit has no effect on variations in income or wealth.
It is to say that these other socially defined characteristics can, and often do, intervene in the valuation of merit.
Gender inequality:
Gender inequality arises when women and men are treated differently on the basis of masculinity and femininity, through the division of labor, the assignment of roles and responsibilities, and the allocation of social rewards. Sex- and gender-based prejudice and discrimination, called sexism, are major contributing factors to social inequality.
Most societies, even agricultural ones, have some sexual division of labor and gender-based division of labor tends to increase during industrialization. The emphasis on gender inequality is born out of the deepening division in the roles assigned to men and women, particularly in the economic, political and educational spheres. Women are underrepresented in political activities and decision making processes in most states in both the Global North and Global South.
Gender discrimination, especially concerning the lower social status of women, has been a topic of serious discussion not only within academic and activist communities but also by governmental agencies and international bodies such as the United Nations.
These discussions seek to identify and remedy widespread, institutionalized barriers to access for women in their societies. By making use of gender analysis, researchers try to understand the social expectations, responsibilities, resources and priorities of women and men within a specific context, examining the social, economic and environmental factors which influence their roles and decision-making capacity.
By enforcing artificial separations between the social and economic roles of men and women, the lives of women and girls are negatively impacted and this can have the effect of limiting social and economic development.
Cultural ideals about women's work can also affect men whose outward gender expression is considered "feminine" within a given society. Transgender and gender-variant persons may express their gender through their appearance, the statements they make, or official documents they present.
In this context, gender normality, which is understood as the social expectations placed on us when we present particular bodies, produces widespread cultural/institutional devaluations of trans identities, homosexuality and femininity. Trans persons, in particular, have been defined as socially unproductive and disruptive.
A variety of global issues like HIV/AIDS, illiteracy, and poverty are often seen as "women's issues" since women are disproportionately affected. In many countries, women and girls face problems such as lack of access to education, which limit their opportunities to succeed, and further limits their ability to contribute economically to their society.
As of 2007, around 20 percent of women were below the $1.25/day international poverty line and 40 percent were below the $2/day mark. More than one-quarter of females under the age of 25 were below the $1.25/day line, and about half lived on less than $2/day.
Women's participation in work has been increasing globally, but women are still faced with wage discrepancies and differences compared to what men earn. This is true globally even in the agricultural and rural sector in developed as well as developing countries.
Structural impediments to women's ability to pursue and advance in their chosen professions often result in a phenomenon known as the glass ceiling, which refers to unseen – and often unacknowledged barriers that prevent minorities and women from rising to the upper rungs of the corporate ladder, regardless of their qualifications or achievements.
This effect can be seen in the corporate and bureaucratic environments of many countries, lowering women's chances to excel. It prevents women from succeeding and making maximum use of their potential, at a cost to women as well as to society's development. Ensuring that women's rights are protected and endorsed can promote a sense of belonging that motivates women to contribute to their society. Once able to work, women should be entitled to the same job security and safe working environments as men.
Until such safeguards are in place, women and girls will continue to experience not only barriers to work and opportunities to earn, but will continue to be the primary victims of discrimination, oppression, and gender-based violence.
Women and persons whose gender identity does not conform to patriarchal beliefs about sex (only male and female) continue to face violence on global domestic, interpersonal, institutional and administrative scales.
While first-wave Liberal Feminist initiatives raised awareness about the lack of fundamental rights and freedoms that women have access to, second-wave feminism (see also Radical Feminism) highlighted the structural forces that underlie gender-based violence. Masculinities are generally constructed so as to subordinate femininities and other expressions of gender that are not heterosexual, assertive and dominant.
Gender sociologist and author, Raewyn Connell, discusses in her 2009 book, Gender, how masculinity is dangerous, heterosexual, violent and authoritative. These structures of masculinity ultimately contribute to the vast amounts of gendered violence, marginalization and suppression that women, queer, transgender, gender variant and gender non-conforming persons face.
Some scholars suggest that women's under-representation in political systems speaks to the idea that "formal citizenship does not always imply full social membership". Men, male bodies and expressions of masculinity are linked to ideas about work and citizenship. Others point out that patriarchal states tend to scale back and claw back their social policies to the disadvantage of women. This process ensures that women encounter resistance to entering meaningful positions of power in institutions, administrations, political systems and communities.
Racial and ethnic inequality:
Racial or ethnic inequality is the result of hierarchical social distinctions between racial and ethnic categories within a society, often established on the basis of characteristics such as skin color and other physical traits, or an individual's place of origin or culture. Racism exists when some races are privileged over others, allowed freer entry into the labor market and better compensated than others.
Ethnicity refers to the privileges one may enjoy for belonging to a particular ethnic group. Even though race has no biological basis, it has become a socially constructed category capable of restricting or enabling social status.
Racial inequality can also result in diminished opportunities for members of marginalized groups, which in turn can lead to cycles of poverty and political marginalization. Racial and ethnic groups may thereby become minority categories within a society.
Minority members in such a society are often subjected to discriminatory actions resulting from majority policies, including assimilation, exclusion, oppression, expulsion, and extermination.
For example, during the run-up to the 2012 federal elections in the United States, legislation in certain "battleground states" that claimed to target voter fraud had the effect of disenfranchising tens of thousands of primarily African American voters.
These types of institutional barriers to full and equal social participation have far-reaching effects within marginalized communities, including reduced economic opportunity and output, reduced educational outcomes and opportunities and reduced levels of overall health.
In the United States, Angela Davis argues that mass incarceration has been a modern tool of the state to impose inequality, repression, and discrimination upon African Americans and Hispanics. The War on Drugs has been a campaign with disparate effects, ensuring the constant incarceration of poor, vulnerable, and marginalized populations in North America.
Over a million African Americans are incarcerated in the US, many of whom have been convicted of a non-violent drug possession charge.
With the States of Colorado and Washington having legalized the possession of marijuana, drug reformists and anti-war on drugs lobbyists are hopeful that drug issues will be interpreted and dealt with from a healthcare perspective instead of a matter of criminal law.
In Canada, Aboriginal, First Nations, and Indigenous persons represent over a quarter of the federal prison population, even though they only represent 3% of the country's population.
Age inequality:
Age discrimination is defined as the unfair treatment of people with regard to promotions, recruitment, resources, or privileges because of their age. It is also known as ageism: the stereotyping of and discrimination against individuals or groups based upon their age. It is a set of beliefs, attitudes, norms, and values used to justify age-based prejudice, discrimination, and subordination.
One form of ageism is adultism, which is the discrimination against children and people under the legal adult age. An example of an act of adultism might be the policy of a certain establishment, restaurant, or place of business to not allow those under the legal adult age to enter their premises after a certain time or at all.
While some people may benefit from or enjoy these practices, others find them offensive and discriminatory. Discrimination against those under the age of 40, however, is not illegal under the current U.S. Age Discrimination in Employment Act (ADEA).
As implied in the definitions above, treating people differently based upon their age is not necessarily discrimination. Virtually every society is age-stratified, and the age structure of a society changes as people begin to live longer and the population grows older.
In most cultures, there are different social role expectations for people of different ages to perform. Every society manages people's ageing by allocating certain roles for different age groups. Age discrimination primarily occurs when age is used as an unfair criterion for allocating more or less resources.
Scholars of age inequality have suggested that certain social organizations favor particular age inequalities. For instance, because of their emphasis on training and maintaining productive citizens, modern capitalist societies may dedicate disproportionate resources to training the young and maintaining the middle-aged worker to the detriment of the elderly and the retired (especially those already disadvantaged by income/wealth inequality).
In modern, technologically advanced societies, there is a tendency for both the young and the old to be relatively disadvantaged. However, more recently, in the United States the tendency is for the young to be most disadvantaged. For example, poverty levels in the U.S. have been decreasing among people aged 65 and older since the early 1970s whereas the number of children under 18 in poverty has steadily risen.
Sometimes, the elderly have had the opportunity to build their wealth throughout their lives, while younger people have the disadvantage of recently entering into or having not yet entered into the economic sphere. The larger contributor to this, however, is the increase in the number of people over 65 receiving Social Security and Medicare benefits in the U.S.
When we compare income distribution among youth across the globe, we find that about half (48.5 percent) of the world's young people are confined to the bottom two income brackets as of 2007. This means that, out of the three billion persons under the age of 24 in the world as of 2007, approximately 1.5 billion were living in situations in which they and their families had access to just nine percent of global income.
Moving up the income distribution ladder, children and youth do not fare much better: more than two-thirds of the world's youth have access to less than 20 percent of global wealth, with 86 percent of all young people living on about one-third of world income. For the just over 400 million youth who are fortunate enough to rank among families or situations at the top of the income distribution, however, opportunities improve greatly with more than 60 percent of global income within their reach.
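The shares quoted above can be translated into per-capita terms with simple arithmetic: a group's income per person, relative to the global average, is its income share divided by its population share. A small sketch using the figures from the text (the function name is ours, for illustration):

```python
# Per-capita relative income = income share / population share.
# A value of 1.0 means exactly the global per-capita average.
def relative_income(income_share, population_share):
    return income_share / population_share

# The bottom ~48.5% of the world's youth hold ~9% of global income:
print(round(relative_income(0.09, 0.485), 2))     # → 0.19 of the global average
# Just over 400M of ~3B youth reach ~60% of global income:
print(round(relative_income(0.60, 400 / 3000), 2))  # → 4.5 times the average
```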
Although this does not exhaust the scope of age discrimination, in modern societies it is often discussed primarily with regards to the work environment. Indeed, non-participation in the labour force and the unequal access to rewarding jobs means that the elderly and the young are often subject to unfair disadvantages because of their age. On the one hand, the elderly are less likely to be involved in the workforce.
At the same time, old age may or may not put one at a disadvantage in accessing positions of prestige. Old age may benefit one in such positions, but it may also disadvantage one because of negative ageist stereotyping of old people. On the other hand, young people are often disadvantaged from accessing prestigious or relatively rewarding jobs, because of their recent entry to the work force or because they are still completing their education.
Typically, once they enter the labor force or take a part-time job while in school, they start at entry-level positions with low wages. Furthermore, because of their lack of prior work experience, they can often be forced to take marginal jobs, where they can be taken advantage of by their employers. As a result, many young people face obstacles early in their working lives.
Inequalities in health:
Further information:
- Health equity,
- Inequality in disease,
- Social determinants of health in poverty,
- and Diseases of poverty
Health inequalities can be defined as differences in health status or in the distribution of health determinants between different population groups.
Health care:
Health inequalities are in many cases related to access to health care. In industrialized nations, health inequalities are most prevalent in countries that have not implemented a universal health care system, such as the United States.
Because the US health care system is heavily privatized, access to health care is dependent upon one's economic capital: health care is not a right but a commodity that can be purchased through private insurance companies (or that is sometimes provided through an employer).
The way health care is organized in the U.S. contributes to health inequalities based on gender, socioeconomic status and race/ethnicity. As Wright and Perry assert, "social status differences in health care are a primary mechanism of health inequalities".
In the United States, over 48 million people are without medical care coverage. This means that almost one sixth of the population is without health insurance, mostly people belonging to the lower classes of society.
While universal access to health care may not completely eliminate health inequalities, it has been shown that it greatly reduces them. In this context, privatization gives individuals the 'power' to purchase their own health care (through private health insurance companies), but this leads to social inequality by only allowing people who have economic resources to access health care.
Citizens are seen as consumers who have a 'choice' to buy the best health care they can afford; in alignment with neoliberal ideology, this puts the burden on the individual rather than the government or the community.
In countries that have a universal health care system, health inequalities have been reduced. In Canada, for example, equity in the availability of health services has been improved dramatically through Medicare. People do not have to worry about how they will pay for health care, or rely on emergency rooms for care, since health care is provided for the entire population.
However, inequality issues still remain. For example, not everyone has the same level of access to services. Inequalities in health are not, however, only related to access to health care. Even if everyone had the same level of access, inequalities may still remain.
This is because health status is a product of more than just how much medical care people have available to them. While Medicare has equalized access to health care by removing the need for direct payments at the time of services, which improved the health of low-status people, inequities in health are still prevalent in Canada. This may be due to the state of the current social system, which bears other types of inequality, such as economic, racial and gender inequality.
A lack of health equity is also evident in the developing world, where the importance of equitable access to healthcare has been cited as crucial to achieving many of the Millennium Development Goals. Health inequalities can vary greatly depending on the country one is looking at. Health equity is needed in order to live a healthier and more sufficient life within society.
Inequalities in health have substantial effects that burden the entire society. They are often associated with socioeconomic status and access to health care. Health inequities can occur when the distribution of public health services is unequal.
For example, in Indonesia in 1990, only 12% of government spending for health was for services consumed by the poorest 20% of households, while the wealthiest 20% consumed 29% of the government subsidy in the health sector.
Access to health care is heavily influenced by socioeconomic status as well, as wealthier population groups have a higher probability of obtaining care when they need it. A study by Makinen et al. (2000) found that in the majority of developing countries they looked at, there was an upward trend by quantile in health care use for those reporting illness.
Wealthier groups are also more likely to be seen by doctors and to receive medicine.
Food:
There has been considerable research in recent years regarding a phenomenon known as food deserts, in which low access to fresh, healthy food in a neighborhood leads to poor consumer choices and options regarding diet.
It is widely thought that food deserts are significant contributors to the childhood obesity epidemic in the United States and many other countries. This may have significant impacts on the local level as well as in broader contexts, such as in Greece, where the childhood obesity rate has skyrocketed in recent years heavily as a result of the rampant poverty and the resultant lack of access to fresh foods.
Global inequality:
See also: International inequality
The economies of the world have developed unevenly, historically, such that entire geographical regions were left mired in poverty and disease while others began to reduce poverty and disease on a wholesale basis.
This was represented by a type of North–South divide that existed after World War II between First world, more developed, industrialized, wealthy countries and Third world countries, primarily as measured by GDP.
From around 1980, however, through at least 2011, the GDP gap, while still wide, appeared to be closing and, in some more rapidly developing countries, life expectancies began to rise. However, there are numerous limitations of GDP as an economic indicator of social "well-being."
If we look at the Gini coefficient for world income, over time, after World War II the global Gini coefficient sat at just under .45. From around 1959 to 1966, the global Gini increased sharply, to a peak of around .48 in 1966.
After falling and leveling off a couple of times between roughly 1967 and 1984, the Gini began to climb again in the mid-eighties, reaching a high of around .54 in 2000 and then jumping again to around .70 in 2002.
Since the late 1980s, the gap between some regions has markedly narrowed— between Asia and the advanced economies of the West, for example—but huge gaps remain globally.
Overall equality across humanity, considered as individuals, has improved very little. Within the decade between 2003 and 2013, income inequality grew even in traditionally egalitarian countries like Germany, Sweden and Denmark. With a few exceptions—France, Japan, Spain—the top 10 percent of earners in most advanced economies raced ahead, while the bottom 10 percent fell further behind.
By 2013, a tiny elite of multi-billionaires, 85 to be exact, had amassed wealth equivalent to all the wealth owned by the poorest half (3.5 billion) of the world's total population of 7 billion. Country of citizenship (an ascribed status characteristic) explains 60% of variability in global income; citizenship and parental income class (both ascribed status characteristics) combined explain more than 80% of income variability.
Inequality and economic growth:
The concept of economic growth is fundamental in capitalist economies. Productivity must grow as population grows and capital must grow to feed into increased productivity.
Investment of capital leads to returns on investment (ROI) and increased capital accumulation. The hypothesis that economic inequality is a necessary precondition for economic growth has been a mainstay of liberal economic theory.
Recent research, particularly over the first two decades of the 21st century, has called this basic assumption into question. While growing inequality does have a positive correlation with economic growth under specific sets of conditions, inequality in general is not positively correlated with economic growth and, under some conditions, shows a negative correlation with economic growth.
Milanovic (2011) points out that overall, global inequality between countries is more important to growth of the world economy than inequality within countries. While global economic growth may be a policy priority, recent evidence about regional and national inequalities cannot be dismissed when more local economic growth is a policy objective.
The recent financial crisis and global recession hit countries and shook financial systems all over the world. This led to the implementation of large-scale fiscal expansionary interventions and, as a result, to massive public debt issuance in some countries.
Governmental bailouts of the banking system further burdened fiscal balances and raised considerable concern about the fiscal solvency of some countries. Most governments want to keep deficits under control, but rolling back the expansionary measures, or cutting spending and raising taxes, implies an enormous wealth transfer from taxpayers to the private financial sector.
Expansionary fiscal policies shift resources and cause worries about growing inequality within countries. Moreover, recent data confirm an ongoing trend of increasing income inequality since the early nineties. Increasing inequality within countries has been accompanied by a redistribution of economic resources between developed economies and emerging markets.
Davtyan et al. (2014) studied the interaction of these fiscal conditions, and of changes in fiscal and economic policies, with income inequality in the UK, Canada, and the US. They find that income inequality has a negative effect on economic growth in the case of the UK, but a positive effect in the cases of the US and Canada.
Income inequality generally reduces government net lending/borrowing for all the countries. Economic growth, they find, leads to an increase of income inequality in the case of the UK and to the decline of inequality in the cases of the US and Canada.
At the same time, economic growth improves government net lending/borrowing in all the countries. Government spending leads to the decline in inequality in the UK but to its increase in the US and Canada.
Following the results of Alesina and Rodrik (1994), Bourguignon (2004), and Birdsall (2005), which show that developing countries with high inequality tend to grow more slowly, Ortiz and Cummins (2011) reach the same conclusion. For the 131 countries for which they could estimate the change in Gini index values between 1990 and 2008, they find that the countries that increased their levels of inequality experienced slower annual per capita GDP growth over the same period.
Noting a lack of data for national wealth, they build an index using Forbes list of billionaires by country normalized by GDP and validated through correlation with a Gini coefficient for wealth and the share of wealth going to the top decile.
They find that many countries generating low rates of economic growth are also characterized by a high level of wealth inequality with wealth concentration among a class of entrenched elites.
They conclude that extreme inequality in the distribution of wealth globally, regionally and nationally, coupled with the negative effects of higher levels of income disparities, should make us question current economic development approaches and examine the need to place equity at the center of the development agenda.
Ostry, et al. (2014) reject the hypothesis that there is a major trade-off between a reduction of income inequality (through income redistribution) and economic growth. If that were the case, they hold, then redistribution that reduces income inequality would on average be bad for growth, taking into account both the direct effect of higher redistribution and the effect of the resulting lower inequality.
Their research shows rather the opposite: increasing income inequality has a significant and, in most cases, negative effect on economic growth, while redistribution has an overall pro-growth effect (in one sample) or no effect on growth. Their conclusion is that increasing inequality, particularly when inequality is already high, results in low growth, if any, and that such growth may be unsustainable over long periods.
Piketty and Saez (2014) note that there are important differences between income and wealth inequality dynamics. First, wealth concentration is always much higher than income concentration.
The top 10 percent of wealth share typically falls in the 60 to 90 percent range of all wealth, whereas the top 10 percent income share is in the 30 to 50 percent range. The bottom 50 percent wealth share is always less than 5 percent, whereas the bottom 50 percent income share generally falls in the 20 to 30 percent range.
The bottom half of the population hardly owns any wealth, but it does earn appreciable income. On average, members of the bottom half of the population, in terms of wealth, own less than one-tenth of the average wealth. The inequality of labor income can be high, but it is usually much less extreme.
Members of the bottom half of the population in income earn about half the average income. In sum, the concentration of capital ownership is always extreme, so that the very notion of capital is fairly abstract for large segments—if not the majority—of the population.
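The "one-tenth of the average wealth" and "about half the average income" figures follow from simple share arithmetic: a group's average holding, relative to the overall average, equals its share of the total divided by its share of the population. A minimal sketch using the shares quoted above (the function name is ours, for illustration):

```python
# A group's average holding relative to the overall average equals
# its share of the total divided by its share of the population.
def group_average_ratio(share_of_total, share_of_population):
    return share_of_total / share_of_population

# Bottom 50% with <5% of all wealth: each member averages <1/10 of mean wealth.
print(group_average_ratio(0.05, 0.50))  # → 0.1
# Bottom 50% with ~25% of all income: each member averages ~half of mean income.
print(group_average_ratio(0.25, 0.50))  # → 0.5
```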
Piketty (2014) finds that wealth-income ratios, today, seem to be returning to very high levels in low economic growth countries, similar to what he calls the "classic patrimonial" wealth-based societies of the 19th century wherein a minority lives off its wealth while the rest of the population works for subsistence living. He surmises that wealth accumulation is high because growth is low.
See also:
- How Much More (Or Less) Would You Make If We Rolled Back Inequality? (January 2015). "How much more (or less) would families be earning today if inequality had remained flat since 1979?" National Public Radio
- OECD – Education GPS: Gender differences in education
- Civil rights
- Digital divide
- Educational inequality
- Gini coefficient
- Global justice
- Health equity
- Horizontal inequality
- List of countries by income inequality
- List of countries by distribution of wealth
- LGBT social movements
- Social apartheid
- Social equality
- Social exclusion
- Social mobility
- Social stratification
- Structural violence
- Tax evasion
- Triple oppression
Social Justice
- YouTube Video: What Does Social Justice Mean to YOU?
- YouTube Video: If I Could Change the World...
- YouTube Video: Listen: Dr. Maya Angelou Recites Her Poem "Phenomenal Woman"
Social justice is a concept of fair and just relations between the individual and society. This is measured by the explicit and tacit terms for the distribution of wealth, opportunities
for personal activity, and social privileges.
In Western as well as in older Asian cultures, the concept of social justice has often referred to the process of ensuring that individuals fulfill their societal roles and receive what was their due from society. In the current global grassroots movements for social justice, the emphasis has been on the breaking of barriers for social mobility, the creation of safety nets and economic justice.
Social justice assigns rights and duties in the institutions of society, which enables people to receive the basic benefits and burdens of cooperation. The relevant institutions often include the following:
- taxation,
- social insurance,
- public health,
- public school,
- public services,
- labor law and regulation of markets, to ensure fair distribution of wealth, and equal opportunity.
Interpretations that relate justice to a reciprocal relationship to society are mediated by differences in cultural traditions, some of which emphasize the individual responsibility toward society and others the equilibrium between access to power and its responsible use.
Hence, social justice is invoked today while reinterpreting historical figures such as Bartolomé de las Casas, in philosophical debates about differences among human beings, in efforts for gender, racial and social equality, for advocating justice for migrants, prisoners, the environment, and the physically and developmentally disabled.
While the concept of social justice can be traced through the theology of Augustine of Hippo and the philosophy of Thomas Paine, the term "social justice" became used explicitly in the 1780s. A Jesuit priest named Luigi Taparelli is typically credited with coining the term, and it spread during the revolutions of 1848 with the work of Antonio Rosmini-Serbati.
However, recent research has shown that use of the expression "social justice" is older, appearing even before the 19th century. For example, in Anglo-America, the term appears in The Federalist Papers, No. 7: "We have observed the disposition to retaliation excited in Connecticut in consequence of the enormities perpetrated by the Legislature of Rhode Island; and we reasonably infer that, in similar cases, under other circumstances, a war, not of parchment, but of the sword, would chastise such atrocious breaches of moral obligation and social justice."
In the late industrial revolution, progressive American legal scholars began to use the term more, particularly Louis Brandeis and Roscoe Pound. From the early 20th century it was also embedded in international law and institutions; the preamble to establish the International Labour Organization recalled that "universal and lasting peace can be established only if it is based upon social justice."
In the later 20th century, social justice was made central to the philosophy of the social contract, primarily by John Rawls in A Theory of Justice (1971). In 1993, the Vienna Declaration and Programme of Action treats social justice as a purpose of human rights education.
Some authors such as Friedrich Hayek criticize the concept of social justice, arguing the lack of objective, accepted moral standard; and that while there is a legal definition of what is just and equitable "there is no test of what is socially unjust", and further that social justice is often used for the reallocation of resources based on an arbitrary standard which may in fact be inequitable or unjust.
In the later 20th century, social justice was made central to the philosophy of the social contract, primarily by John Rawls in A Theory of Justice (1971). In 1993, the Vienna Declaration and Programme of Action treated social justice as a purpose of human rights education.
Some authors such as Friedrich Hayek criticize the concept of social justice, arguing that it lacks an objective, accepted moral standard; that while there is a legal definition of what is just and equitable, "there is no test of what is socially unjust"; and further that social justice is often used to justify the reallocation of resources based on an arbitrary standard which may in fact be inequitable or unjust.
Click on any of the following blue hyperlinks for more about Social Justice:
- History
- Contemporary theory
- Religious perspectives
- Social justice movements
- Criticism
- See also:
- Activism
- "Beyond Vietnam: A Time to Break Silence", an anti-Vietnam war and pro-social justice speech delivered by Martin Luther King, Jr. in 1967
- Counterculture of the 1960s
- Climate justice
- Economic justice
- Environmental justice
- Environmental racism
- Essentially contested concept
- Labor law and labor rights
- Left-wing politics
- Resource justice
- Right to education
- Right to health
- Right to housing
- Right to social security
- Socialism
- Social justice art
- Social justice warrior
- Social law
- Social work
- Solidarity
- Völkisch equality
- World Day of Social Justice
- All pages with titles beginning with Social justice
- All pages with titles containing Social justice
Race and ethnicity in the United States including a List by household income
Pictured below: by Racial and Ethnic Makeup (L) Population; (R) Household Income
- YouTube Video: Poll: A majority of Americans think President Trump is a racist
- YouTube Video: Former President Obama unleashes on Trump, GOP - Full speech from Illinois
- YouTube Video: Is America Dreaming?: Understanding Social Mobility*
Click here for a List of racial and ethnic groups in the United States by household income.
The United States has a racially and ethnically diverse population. The United States Census officially recognizes six racial categories:
- White American,
- Black or African American,
- Native American and Alaska Native,
- Asian American,
- Native Hawaiian and Other Pacific Islander,
- People of two or more races; a category called "some other race" is also used in the census and other surveys, but is not official.
The United States Census Bureau also classifies Americans as "Hispanic or Latino" and "Not Hispanic or Latino", identifying Hispanic and Latino Americans as an ethnicity (not a race) distinct from others; Hispanic and Latino Americans compose the largest minority group in the nation.
The United States Supreme Court unanimously held that "race" is not limited to Census designations on the "race question" but extends to all ethnicities, and thus can include Jewish and Arab as well as Polish or Italian or Irish, etc. In fact, the Census asks an "Ancestry Question" which covers the broader notion of ethnicity initially in the 2000 Census long form and now in the American Community Survey.
White Americans are the racial majority. African Americans are the largest racial minority, amounting to 13.2% of the population. Hispanic and Latino Americans amount to 17% of the population, making up the largest ethnic minority. The White, non-Hispanic or Latino population makes up 62.6% of the nation's total, with the total White population (including White Hispanics and Latinos) being 77%.
White Americans are the majority in every region except Hawaii, but comprise the highest proportion of the population in the Midwestern United States, at 85% per the Population Estimates Program (PEP), or 83% per the American Community Survey (ACS).
Non-Hispanic Whites make up 79% of the Midwest's population, the highest ratio of any region. However, 35% of White Americans (whether all White Americans or non-Hispanic/Latino only) live in the South, the most of any region.
55% of the African American population lives in the South. A plurality or majority of the other official groups reside in the West. This region is home to 42% of Hispanic and Latino Americans, 46% of Asian Americans, 48% of American Indians and Alaska Natives, 68% of Native Hawaiians and Other Pacific Islanders, 37% of the "two or more races" population (Multiracial Americans), and 46% of those designated "some other race".
Click on any of the following blue hyperlinks for more about Race and Ethnicity in the United States:
- Racial and ethnic categories
- Social definitions of race
- Historical trends and influences
- Racial makeup of the U.S. population
- Ancestry
- See also:
According to the U.S. Census Bureau, the United States had an estimated population of 328,239,523 in 2019 (with an unofficial statistical adjustment to 329,484,123 as of July 1, 2020 ahead of the final 2020 Census).
The United States is the third most populous country in the world, and current projections from the unofficial U.S. Population Clock show a total of just over 330 million residents.
All these figures exclude the population of five self-governing U.S. territories (Puerto Rico, Guam, the U.S. Virgin Islands, American Samoa and the Northern Mariana Islands) as well as several minor island possessions.
The Census Bureau showed a population increase of 0.75% for the twelve-month period ending in July 2012. Though high by industrialized country standards, this is below the world average annual rate of 1.1%. The total fertility rate in the United States estimated for 2019 is 1.706 children per woman, which is below the replacement fertility rate of approximately 2.1.
The U.S. population almost quadrupled during the 20th century—at a growth rate of about 1.3% a year—from about 76 million in 1900 to 281 million in 2000. It is estimated to have reached the 200 million mark in 1967, and the 300 million mark on October 17, 2006.
Foreign-born immigration has caused the U.S. population to continue its rapid increase, with the foreign-born population doubling from almost 20 million in 1990 to over 45 million in 2015, representing one-third of the population increase.
Population growth is fastest among minorities as a whole, and according to the Census Bureau's estimation for 2020, 50% of U.S. children under the age of 18 are members of ethnic minority groups.
White people constitute the majority of the U.S. population, with a total of about 234,370,202 or 73% of the population as of 2017. Non-Hispanic Whites make up 60.7% of the country's population. Their share of the U.S. population is expected to fall below 50% by 2045, primarily due to immigration and low birth rates.
Hispanic and Latino Americans accounted for 48% of the national population growth of 2.9 million between July 1, 2005, and July 1, 2006. Immigrants and their U.S.-born descendants are expected to provide most of the U.S. population gains in the decades ahead.
The Census Bureau projects a U.S. population of 417 million in 2060, a 38% increase from 2007 (301.3 million), and the United Nations estimates that the U.S. will be among the nine countries responsible for half the world's population growth by 2050, with its population being 402 million by then (an increase of 32% from 2007).
In an official census report, it was reported that 54.4% (2,150,926 out of 3,953,593) of births in 2010 were to "non-Hispanic whites". This represents an increase of 0.3% compared to the previous year, which was 54.1%.
Click on any of the following blue hyperlinks for more about the Demographics of the United States:
- Population
- Vital statistics
- Historical data
- Population centers
- Race and ethnicity
- LGBT Americans
- Foreign-born population
- Citizens living abroad
- Religion
- Income
- Economic class
- Generational cohorts
- Demographic statistics
- See also:
- Lists
- Income
- United States Census Bureau
- New York Times: "Mapping the 2010 U.S. Census"
- 2000 Census of Population and Housing United States, U.S. Census Bureau
- Asian-Nation: Demographics of Asian Americans
- USA Today: Countdown to 300 million
- Census Ancestry Map
- USA Today 2004 Election County by County Map
- Google – public data "Population in the U.S.A."
Wealth Inequality in the United States
- YouTube Video: How wealth inequality is dangerous for America
- YouTube Video: Wealth Gap: Last Week Tonight with John Oliver (HBO)
- YouTube Video: Wealth Inequality in America
The above infographic explores the rise in income inequality in the United States, with particular emphasis on the last four decades, over which inequality has risen steadily. The topic is now very much a part of regular public discussion. Read the infographic to learn what income inequality is, the factors that contribute to it, the statistics and facts behind it, and how the US compares with trends around the world.
What is Income Inequality?
Income inequality is a broad term used to measure the inequality of household/individual income of various members within an economy. Income has multiple streams including wages, salaries, interest, dividends, rent received, profits earned, benefits received, etc.
Income inequality is often represented in statistical form, measuring the percentage of income for different groups versus the entire population. For example, an Economic Policy Institute (EPI) survey in the US in 2015 revealed that a family in the top 1 percent nationally received, on average, 26.3 times as much income as a family in the bottom 99 percent.
10 Factors Impacting US Income Inequality:
There are many contributing factors as to why US income inequality has grown. Listed below are a range of factors as to why this has grown in the past four decades:
It’s important to emphasize that income inequality is built up by literally hundreds of different factors. We believe the ten listed above are certainly some of the most important factors affecting income inequality within the United States.
___________________________________________________________________________
Wikipedia:
Wealth inequality in the United States (also known as the wealth gap) is the unequal distribution of assets among residents of the United States. Wealth includes the values of homes, automobiles, personal valuables, businesses, savings, and investments.
The net worth of U.S. households and non-profit organizations was $94.7 trillion in the first quarter of 2017, a record level both in nominal terms and purchasing power parity. If divided equally among 124 million U.S. households, this would be $760,000 per family; however, the bottom 50% of families, representing 62 million American households, average $11,000 net worth. From an international perspective, the difference in US median and mean wealth per adult is over 600%.
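A quick back-of-the-envelope check of the per-household figure quoted above (the totals are the ones cited in the text; only the division is added here):

```python
# Rough arithmetic behind the "$760,000 per family" figure quoted above.
total_net_worth = 94.7e12   # $94.7 trillion, Q1 2017 (from the text)
households = 124e6          # 124 million U.S. households (from the text)

mean_per_household = total_net_worth / households
print(f"${mean_per_household:,.0f}")  # $763,710, i.e. roughly $760,000

# The bottom 50% of households average only $11,000, which is why the
# mean sits so far above the median: wealth is heavily right-skewed.
```

That gap between mean and median is exactly what the "over 600%" figure at the end of the paragraph describes: a heavily skewed distribution pulls the mean far above the median.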
Just prior to President Obama's 2014 State of the Union Address, media reported that the wealthiest 1% possess 40% of the nation's wealth while the bottom 80% own 7%; similarly, but later, the media reported that the "richest 1 percent in the United States now own more wealth than the bottom 90 percent".
The gap between the top 10% and the middle class is over 1,000%; that increases another 1,000% for the top 1%. The average employee "needs to work more than a month to earn what the CEO earns in one hour." Although wealth inequality differs from income inequality, the two are related.
In Inequality for All, a 2013 documentary in which Robert Reich argues that income inequality is the defining issue for the United States, Reich states that 95% of economic gains since 2009, when the recovery was believed to have started, went to the top 1% by net worth (HNWI). More recently, in 2017, an Oxfam study found that eight rich people, six of them Americans, own as much combined wealth as half the human race.
From 1989 to 2018, the top 1 percent increased its total net worth by $21 trillion, while the bottom 50 percent saw its net worth decrease by $900 billion over the same period (both figures in 2018 dollars).
A 2011 study found that US citizens across the political spectrum dramatically underestimate the current US wealth inequality and would prefer a far more egalitarian distribution of wealth.
Wealth is usually not used for daily expenditures or factored into household budgets, but combined with income it comprises a family's total opportunity to secure a desired stature and standard of living, or to pass their class status along to their children.
Moreover, wealth provides for both short- and long-term financial security, bestows social prestige, and contributes to political power, and can be used to produce more wealth.
Hence, wealth possesses a psychological element that awards people the feeling of agency, or the ability to act. The accumulation of wealth grants more options and eliminates restrictions about how one can live life.
Dennis Gilbert asserts that the standard of living of the working and middle classes is dependent upon income and wages, while the rich tend to rely on wealth, distinguishing them from the vast majority of Americans. A September 2014 study by Harvard Business School declared that the growing disparity between the very wealthy and the lower and middle classes is no longer sustainable.
Statistics:
In 2007, the top 20% wealthiest possessed 80% of all financial assets. In 2007 the richest 1% of the American population owned 35% of the country's total wealth, and the next 19% owned 51%. Thus, the top 20% of Americans owned 86% of the country's wealth and the bottom 80% of the population owned 14%.
In 2011, financial inequality was greater than inequality in total wealth, with the top 1% of the population owning 43%, the next 19% of Americans owning 50%, and the bottom 80% owning 7%. However, after the Great Recession which started in 2007, the share of total wealth owned by the top 1% of the population grew from 35% to 37%, and that owned by the top 20% of Americans grew from 86% to 88%. The Great Recession also caused a drop of 36% in median household wealth, but a drop of only 11% for the top 1%, further widening the gap between the top 1% and the bottom 99%.
According to PolitiFact and others, in 2011 the 400 wealthiest Americans had more wealth than half of all Americans combined. Inherited wealth may help explain why many Americans who have become rich had a substantial head start. In September 2012, according to the Institute for Policy Studies, over 60 percent of the Forbes richest 400 Americans grew up in substantial privilege.
In 2013 wealth inequality in the U.S. was greater than in most developed countries other than Switzerland and Denmark. In the United States, the use of offshore holdings is exceptionally small compared to Europe, where much of the wealth of the top percentiles is kept in offshore holdings.
While the statistical problem is Europe-wide, in Southern Europe statistics become even more unreliable. Fewer than a thousand people in Italy have declared incomes of more than 1 million euros. A former Prime Minister of Italy described tax evasion as a "national pastime".
According to a 2014 Credit Suisse study, the ratio of wealth to household income is the highest it has been since the Great Depression.
However, according to the Federal Reserve, "For most households, pensions and Social Security are the most important sources of income during retirement, and the promised benefit stream constitutes a sizable fraction of household wealth" and "including pensions and Social Security in net worth makes the distribution more even". A September 2017 study by the Federal Reserve reported that the top 1% owned 38.5% of the country's wealth in 2016.
According to a June 2017 report by the Boston Consulting Group, around 70% of the nation's wealth will be in the hands of millionaires and billionaires by 2021.
Early 20th century:
Pioneering work by Simon Kuznets using income tax records and his own well-researched estimates of national income showed a reduction of about 10% in the portion of national income going to the top 10%, a reduction from about 45–50% in 1913 to about 30–35% in 1948.
This period spans both the Great Depression and World War II, events with significant economic consequences. This reduction is called the Great Compression.
Wealth and Income:
There is an important distinction between income and wealth. Income refers to a flow of money over time in the form of a rate (per hour, per week, or per year); wealth is a collection of assets owned minus liabilities. In essence, income is specifically what people receive through work, retirement, or social welfare whereas wealth is what people own.
While the two are seemingly related, income inequality alone is insufficient for understanding economic inequality for two reasons:
The United States Census Bureau formally defines income as money received on a regular basis (exclusive of certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc.
By this official measure, the wealthiest families may have low income, but the value of their assets earns enough money to support their lifestyle. Dividends from trusts or gains in the stock market do not fall under the definition of income but are the primary money flows for the wealthy. Retired people also have little income but usually have a higher net worth because of money saved over time.
Additionally, income does not capture the extent of wealth inequality. Wealth is derived over time from the collection of income earnings and growth of assets. The income of one year cannot encompass the accumulation over a lifetime, so income statistics view too narrow a time span to be an adequate indicator of financial inequality. For example, the Gini coefficient for wealth inequality increased from 0.80 in 1983 to 0.84 in 1989, while in the same year, 1989, the Gini coefficient for income was only 0.52. (The Gini coefficient is an economic measure on a scale from 0 to 1, where 0 represents perfect equality and 1 perfect inequality.) From this data it is evident that in 1989 wealth inequality was significantly higher than income inequality.
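To make the Gini figures above concrete, here is a minimal sketch of how the coefficient can be computed from a list of wealth or income values (the sample vectors below are invented purely for illustration; they are not survey data):

```python
def gini(values):
    """Gini coefficient of a list of non-negative values.
    0 = perfect equality, 1 = perfect inequality."""
    vals = sorted(values)
    n = len(vals)
    total = sum(vals)
    # Standard closed form: sum of (2i - n - 1) * x_i over sorted values,
    # normalized by n * total.
    weighted = sum((2 * (i + 1) - n - 1) * v for i, v in enumerate(vals))
    return weighted / (n * total)

print(gini([10, 10, 10, 10]))  # 0.0  -- everyone holds the same amount
print(gini([0, 0, 0, 100]))    # 0.75 -- one person holds nearly everything
```

By this measure, a wealth Gini of 0.84 versus an income Gini of 0.52 (the 1989 figures above) quantifies how much more concentrated wealth was than income.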
Recent research shows that many households, in particular those headed by young parents (younger than 35), minorities, and individuals with low educational attainment, display very little accumulation. Many have no financial assets and their total net worth is also low.
According to the Congressional Budget Office, between 1979 and 2007 incomes of the top 1% of Americans grew by an average of 275%. ...
(Note: The IRS insists that comparisons of adjusted gross income pre-1987 and post-1987 are complicated because large changes in the definition of AGI led households in the top income quintile to report much more of their income on their individual income tax form's AGI, rather than reporting their business income on separate corporate tax returns, or not reporting certain non-taxable income, such as municipal bond income, in their AGI at all.
Anyone who wants to discuss incomes in the U.S. fairly must include a chart of all available data split by quintile up to the mid-1980s. That should be followed by a chart from 1990 to 2011. The five-year gap would avoid the major AGI definition changes.
The big picture of this subject is not just a segment of all available data starting in 1979, especially after the IRS warned about the large AGI definition changes in the late 1980s. In addition, IRS studies consistently show that a majority of households in the top income quintile have moved to a lower quintile within one decade.
There are even more changes among households in the top 1%. Without those data, a reader is likely to assume households in the top 1% are almost the same from year to year.) In 2009, people in the top 1% of taxpayers made $343,927 or more.
According to US economist Joseph Stiglitz, the richest 1% of Americans gained 93% of the additional income created in 2010. A study by Emmanuel Saez and Thomas Piketty showed that the top 10 percent of earners took more than half of the country's total income in 2012, the highest level recorded since the government began collecting the relevant data a century ago.
People in the top one percent were three times more likely to work more than 50 hours a week, were more likely to be self-employed, and earned a fifth of their income as capital income. The top one percent was composed of many professions and had an annual turnover rate of more than 25%. The five most common professions were managers, physicians, administrators, lawyers, and teachers.
In the book Modern Labor Economics: Theory and Public Policy, it is noted that in the United States all income that employees received from their employers in 2012 was 8.6 trillion dollars while the amount of money received from all other sources of personal income in that year came to 5.3 trillion dollars. This makes the relationship of employee to employer and vocational employment in general of paramount importance in the United States.
Gender Pay Inequality:
Further information: Gender pay gap
Wealth inequality and child poverty:
Further information: Child poverty in the United States
In 2013 UNICEF data on the well-being of children in 35 developed nations ranked the United States at 34 out of 35 (Romania is the worst). This may reflect growing income inequality.
U.S. stock market ownership distribution:
In March 2017, NPR summarized the distribution of U.S. stock market ownership (direct and indirect through mutual funds) in the U.S., which is highly concentrated among the wealthiest families:
The Federal Reserve reported the median value of stock ownership by income group for 2016:
NPR reported that when politicians reference the stock market as a measure of economic success, that success is not relevant to nearly half of Americans. Further, more than one-third of Americans who work full-time have no access to pensions or retirement accounts such as 401(k)s that derive their value from financial assets like stocks and bonds.
The NYT reported that the percentage of workers covered by generous defined-benefit pension plans has declined from 62% in 1983 to 17% by 2016. While some economists consider an increase in the stock market to have a "wealth effect" that increases economic growth, economists like Former Dallas Federal Reserve Bank President Richard Fisher believe those effects are limited.
Causes of wealth inequality:
Main article: Causes of income inequality in the United States
Essentially, the wealthy possess greater financial opportunities that allow their money to make more money. Earnings from the stock market or mutual funds are reinvested to produce a larger return. Over time, the sum that is invested becomes progressively more substantial.
Those who are not wealthy, however, do not have the resources to enhance their opportunities and improve their economic position. Rather, "after debt payments, poor families are constrained to spend the remaining income on items that will not produce wealth and will depreciate over time."
Scholar David B. Grusky notes that "62 percent of households headed by single parents are without savings or other financial assets." Net indebtedness generally prevents the poor from having any opportunity to accumulate wealth and thereby better their conditions.
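The compounding dynamic described in the last three paragraphs can be sketched numerically. The 5% return, 5% depreciation rate, $10,000 starting value, and 20-year horizon below are all assumptions chosen for illustration, not figures from the text:

```python
# Two stylized households start with the same $10,000 of assets.
# One holds financial assets that compound; the other holds goods
# that depreciate, as the text describes for poor families.
years = 20
return_rate = 0.05       # assumed annual investment return
depreciation = -0.05     # assumed annual loss in value

invested = depreciating = 10_000.0
for _ in range(years):
    invested *= 1 + return_rate
    depreciating *= 1 + depreciation

print(round(invested))       # 26533: the stake more than doubles
print(round(depreciating))   # 3585: the same stake loses most of its value
```

Even with identical starting points, the household able to reinvest pulls steadily ahead of the one whose assets only depreciate, which is the cycle the passage describes.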
Economic inequality is a result of differences in income. Factors that contribute to this gap in wages include level of education, labor market demand and supply, gender differences, growth in technology, and personal abilities. The quality and level of education a person has often corresponds to their skill level, which is reflected in their income.
Wages are also determined by the "market price of a skill" at that current time.
Although gender inequality is a separate social issue, it plays a role in economic inequality. According to the U.S. Census Report, in America the median full-time salary for women is 77 percent of that for men.
Also contributing to the wealth inequality in the U.S., both unskilled and skilled workers are being replaced by machinery. The Seven Pillars Institute for Global Finance and Ethics argues that because of this "technological advance", the income gap between workers and owners has widened.
Income inequality contributes to wealth inequality. For example, economist Emmanuel Saez wrote in June 2016 that the top 1% of families captured 52% of the total real income (GDP) growth per family from 2009 to 2015. From 2009 to 2012, the top 1% captured 91% of the income gains.
Notably, for both the wealthy and not-wealthy, the process of accumulation or debt is cyclical. The rich use their money to earn larger returns and the poor have no savings with which to produce returns or eliminate debt. Unlike income, both facets are generational.
Wealthy families pass down their assets, allowing future generations to develop even more wealth. The poor, on the other hand, are less able to leave inheritances to their children, leaving the latter with little or no wealth on which to build. This is another reason why wealth inequality is so important: its accumulation has direct implications for economic inequality among the children of today's families.
Corresponding to financial resources, the wealthy strategically organize their money so that it will produce profit. Affluent people are more likely to allocate their money to financial assets such as stocks, bonds, and other investments which hold the possibility of capital appreciation.
Those who are not wealthy are more likely to have their money in savings accounts and home ownership. This difference comprises the largest reason for the continuation of wealth inequality in America: the rich are accumulating more assets while the middle and working classes are just getting by.
As of 2007, the richest 1% held about 38% of all privately held wealth in the United States, while the bottom 90% held 73.2% of all debt. According to The New York Times, the richest 1 percent in the United States now own more wealth than the bottom 90 percent.
However, other studies argue that a higher average savings rate will contribute to reducing the share of wealth owned by the rich. The reason is that those richest in wealth are not necessarily the individuals with the highest income; the relative wealth share of poorer quantiles of the population would therefore increase if the savings rate out of income were very high, even though the absolute gap from the wealthiest would still widen.
As the price of commodities increases because of inflation, a larger percentage of lower-class people's money is spent on things they need to survive and go to work, such as food and gasoline. Most of the working poor are paid fixed hourly wages that do not keep up with rises in prices, so every year an increasing percentage of their income is consumed until they have to go into debt just to survive. At this point, their little wealth is owed to lenders and banking institutions.
Economists and politicians such as Emmanuel Saez, Thomas Piketty, and Barack Obama have suggested that the nature of tax policies in America perpetuates economic inequality by steering large sums of wealth into the hands of the wealthiest Americans. The mechanism is that when the wealthy avoid paying taxes, wealth concentrates in their coffers while the poor go into debt.
The economist Joseph Stiglitz argues that "Strong unions have helped to reduce inequality, whereas weaker unions have made it easier for CEOs, sometimes working with market forces that they have helped shape, to increase it." The long fall in unionization in the U.S. since WWII has seen a corresponding rise in the inequality of wealth and income.
Racial disparities:
The wealth gap between white and black families nearly tripled from $85,000 in 1984 to $236,500 in 2009.
There are many causes, including years of home ownership, household income, unemployment, and education, but inheritance might be the most important. Inheritance can directly link the disadvantaged economic position and prospects of today's blacks to the disadvantaged positions of their parents' and grandparents' generations.
According to a report done by Robert B. Avery and Michael S. Rendall, "one in three white households will receive a substantial inheritance during their lifetime compared to only one in ten black households."
This relative lack of inheritance that has been observed among African Americans can be attributed in large part to factors such as unpaid labor (slavery), violent destruction of personal property in incidents such as Red Summer of 1919, unequal opportunity in education and employment (racial discrimination), and more recent policies such as redlining and planned shrinkage.
Other ethnic minorities, particularly those with darker complexions, have at times faced many of these same adversities to various degrees.
The article "America's Financial Divide" added context to racial wealth inequality, stating: "…nearly 96.1 percent of the 1.2 million households in the top one percent by income were white, a total of about 1,150,000 households. In addition, these families were found to have a median net asset worth of $8.3 million. In stark contrast, in the same piece, black households were shown as a mere 1.4 percent of the top one percent by income, that's only 16,800 homes.
In addition, their median net asset worth was just $1.2 million. Using this data as an indicator only several thousand of the over 14 million African American households have more than $1.2 million in net assets…" Relying on data from Credit Suisse and Brandeis University's Institute on Assets and Social Policy, the Harvard Business Review, in the article "How America's Wealthiest Black Families Invest Money", recently took the analysis above a step further.
In that piece the author stated: "If you're white and have a net worth of about $356,000, that's good enough to put you in the 72nd percentile of white families. If you're black, it's good enough to catapult you into the 95th percentile." This means 28 percent of the 83 million white homes, or over 23 million white households, have more than $356,000 in net assets, while only 700,000 of the 14 million black homes have more than $356,000 in total net worth.
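The household counts in the excerpt follow from simple percentile arithmetic on the totals it quotes (83 million white and 14 million black households); this sketch just reproduces that calculation.

```python
white_households = 83_000_000
black_households = 14_000_000

# $356,000 in net worth sits at the 72nd percentile of white families
# (28% exceed it) and the 95th percentile of black families (5% exceed it).
white_above = round(white_households * (1 - 0.72))
black_above = round(black_households * (1 - 0.95))
print(white_above, black_above)   # over 23 million vs. 700,000 households
```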
According to Inequality.org, the median black family is actually only worth $1,700 once durable goods are deducted. In contrast, the median white family holds $116,800 of wealth using the same accounting methods.
Some historical context: In South Africa, during the atrocities of apartheid, the median black family held about 7 percent of typical white South African family net worth. Today, using Wolff’s analysis, the median African American family holds a mere 1.5 percent of median white American family wealth.
A recent piece on Eurweb/Electronic Urban Report "Black Wealth Hardly Exists, Even When You Include NBA, NFL and Rap Stars" stated this about the difference between black middle class families and white middle class families. "Going even further into the data, a recent study by the Institute for Policy Studies (IPS) and the Corporation For Economic Development (CFED) found that it would take 228 years for the average black family to amass the same level of wealth the average white family holds today in 2016.
All while white families create even more wealth over those same two hundred years. In fact, this is a gap that will never close if America stays on its current economic path. According to the Institute on Assets and Social Policy, for each dollar of increase in average income an African American household saw from 1984 to 2009 just $0.69 in additional wealth was generated, compared with the same dollar in increased income creating an additional $5.19 in wealth for a similarly situated white household."
Author Lilian Singh wrote on why the perceptions about black life created by media are misleading in the American Prospect piece "Black Wealth On TV: Realities Don’t Match Perceptions". "Black programming features TV shows that collectively create false perceptions of wealth for African-American families. The images displayed are in stark contrast to the economic conditions the average black family is battling each day."
In a Huffington Post article by Antonio Moore, "The Decadent Veil: Black America's Wealth Illusion", the question of inequity is taken another critical step forward as the piece digs into how celebrity wealth masks this massive inequality.
According to an article by the Pew Research Center, the median wealth of non-Hispanic black households fell by about a third from 2010 to 2013, from $16,600 to $11,000. The median wealth of Hispanic households fell 14.3% as well, from $16,000 to $13,700.
Although the median net worth of all U.S. households decreased over this period, as of 2013 white households had a median net worth of $141,900 while black households had a median net worth of just $11,000.
Hispanic households had a median net worth of just $13,700 over that time as well.
Effect on democracy:
See also: Income inequality in the United States § Effects on democracy and society
A 2014 study by researchers at Princeton and Northwestern concludes that government policies reflect the desires of the wealthy, and that the vast majority of American citizens have "minuscule, near-zero, statistically non-significant impact upon public policy … when a majority of citizens disagrees with economic elites and/or with organized interests, they generally lose."
When Fed chair Janet Yellen was questioned by Bernie Sanders about the study at a congressional hearing in May 2014, she responded "There’s no question that we’ve had a trend toward growing inequality" and that this trend "can shape [and] determine the ability of different groups to participate equally in a democracy and have grave effects on social stability over time."
In Capital in the Twenty-First Century, French economist Thomas Piketty argues that "extremely high levels" of wealth inequality are "incompatible with the meritocratic values and principles of social justice fundamental to modern democratic societies" and that "the risk of a drift towards oligarchy is real and gives little reason for optimism about where the United States is headed."
According to Jedediah Purdy, a researcher at the Duke School of Law, wealth inequality in the United States has repeatedly exposed the problems and shortcomings of its financial system over at least the last fifty years of debate. For years, people believed that distributive justice would produce a sustainable level of wealth inequality, and that the state would be able to effectively diminish whatever inequality arose.
What was largely unexpected is that the inequality created by growing markets would itself lessen the power of that state and prevent the majority of the political community from delivering on its plans for distributive justice; this has only lately come to the attention of the broader public.
Effect on health and well being:
The 2019 World Happiness Report shows the US slipping to 19th place due to increasing wealth inequality, along with rising healthcare costs, surging addiction rates, and an unhealthy work–life balance.
Proposals to reduce wealth inequality:
Taxation of wealth:
Senator Elizabeth Warren proposed an annual tax on wealth in January 2019, specifically a 2% tax for wealth over $50 million and another 1% surcharge on wealth over $1 billion.
Wealth is defined to include all asset classes, including financial assets and real estate.
Economists Emmanuel Saez and Gabriel Zucman estimated that about 75,000 households (less than 0.1%) would pay the tax. It would raise around $2.75 trillion over 10 years, roughly 1% of GDP per year on average. This would raise the total tax burden for those subject to the wealth tax from 3.2% of their wealth under current law to about 4.3% on average, versus 7.2% for the bottom 99% of families.
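The bracket structure of the proposal (2% on wealth above $50 million, with an additional 1% surcharge above $1 billion) works like a marginal-rate schedule. The function name and example net worths below are illustrative, not part of the proposal's text.

```python
def warren_wealth_tax(net_worth):
    """Annual tax: 2% on wealth above $50M, plus a 1% surcharge
    on wealth above $1B (a 3% marginal rate past $1B)."""
    tax = 0.0
    if net_worth > 50_000_000:
        tax += 0.02 * (net_worth - 50_000_000)
    if net_worth > 1_000_000_000:
        tax += 0.01 * (net_worth - 1_000_000_000)
    return tax

# A $100M household owes 2% of the $50M above the threshold: $1M.
print(warren_wealth_tax(100_000_000))    # 1000000.0
# A $2B household owes 2% of $1.95B plus 1% of $1B: $49M.
print(warren_wealth_tax(2_000_000_000))  # 49000000.0
```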
For scale, the federal budget deficit in 2018 was 3.9% of GDP and is expected to rise toward 5% of GDP over the next decade. The plan received both praise and criticism. Two billionaires, Michael Bloomberg and Howard Schultz, criticized the proposal as "unconstitutional" and "ridiculous," respectively. Warren was not surprised by this reaction, stating: "Another billionaire who thinks that billionaires shouldn't pay more in taxes." Economist Paul Krugman wrote in January 2019 that polls indicate the idea of taxing the rich more is very popular.
Limit or tax stock buybacks:
Senators Charles Schumer and Bernie Sanders advocated limiting stock buybacks in January 2019. They explained that from 2008 to 2017, 466 of the S&P 500 companies spent $4 trillion on stock buybacks, about 50% of their profits, with another 40% going to dividends.
During 2018 alone, a record $1 trillion was spent on buybacks. Stock buybacks shift wealth upwards, because the top 1% own about 40% of shares and the top 10% own about 85%.
Further, corporations directing profits to shareholders are not reinvesting the money in the firm or paying workers more. They wrote: "If corporations continue to purchase their own stock at this rate, income disparities will continue to grow, productivity will suffer, the long-term strength of companies will diminish — and the American worker will fall further behind."
Their proposed legislation would prohibit buybacks unless the corporation has taken other steps first, such as paying workers more, providing more benefits such as healthcare and pensions, and investing in the community. To prevent corporations from shifting from buybacks to dividends, they proposed limiting dividends, perhaps by taking action through the tax code.
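Given the ownership shares cited above (the top 1% holding about 40% of shares and the top 10% about 85%), the distribution of a year of buyback-driven gains can be sketched. The even pass-through of buyback value to shareholders is a simplifying assumption, not data from the senators' analysis.

```python
buybacks = 1_000_000_000_000   # ~$1 trillion spent on buybacks in 2018

top_1_share, top_10_share = 0.40, 0.85   # approximate ownership of shares

top_1_gain = buybacks * top_1_share          # ~$400B to the top 1%
bottom_90_gain = buybacks * (1 - top_10_share)  # ~$150B to the bottom 90%
print(f"top 1%: ${top_1_gain/1e9:.0f}B, bottom 90%: ${bottom_90_gain/1e9:.0f}B")
```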
See also:
What is Income Inequality?
Income inequality is a broad term used to measure the inequality of household/individual income of various members within an economy. Income has multiple streams including wages, salaries, interest, dividends, rent received, profits earned, benefits received, etc.
Income inequality is often represented statistically, measuring the percentage of income going to different groups versus the entire population. For example, a 2015 Economic Policy Institute (EPI) survey in the US revealed that a family in the top 1 percent nationally received, on average, 26.3 times as much income as a family in the bottom 99 percent.
10 Factors Impacting US Income Inequality:
There are many contributing factors behind the growth of US income inequality. Listed below are ten factors that have driven this growth over the past four decades:
- Deregulation – the deregulation of financial markets (in particular) and other barriers to enterprise has caused a shift from wage income to profit-based income
- Education access – certain areas of society do not have comparable access to quality education, particularly in secondary schools; this on average reduces income later in life
- Executive talent premiums – workers with higher levels of education and skills are paid large wage premiums, particularly in technology (demand exceeds supply)
- Immigration – the immigration of many low-skilled workers in recent decades has increased competition for low-paid jobs, causing a reduction in hourly rates
- Reduced labor union influence – the reduced influence of labor unions has been another factor causing reduced/declining wages amongst lower paid workers
- Stagnating lower income wages – due to a combination of factors (many in this list), lower-income levels have stagnated in recent decades
- Technology automation – advances in technology have caused the automation of many low-skill processes and functions, which reduces demand (and wages) for low-skilled workers
- Trade globalization – a rise in trade between the US and the rest of the world (particularly China) has reduced demand for labor (particularly in low-paid industries)
- Transfer payments – after tax welfare payments such as social security, unemployment compensation and other benefits have eroded in recent decades
- US Tax policies – the “wealthy elite” now pay a lower proportion of tax, particularly in areas such as capital gains, investment income and savings incentives
It’s important to emphasize that income inequality is shaped by hundreds of different factors. We believe the ten listed above are among the most important factors affecting income inequality within the United States.
___________________________________________________________________________
Wikipedia:
Wealth inequality in the United States (also known as the wealth gap) is the unequal distribution of assets among residents of the United States. Wealth includes the values of homes, automobiles, personal valuables, businesses, savings, and investments.
The net worth of U.S. households and non-profit organizations was $94.7 trillion in the first quarter of 2017, a record level both in nominal terms and purchasing power parity. If divided equally among 124 million U.S. households, this would be $760,000 per family; however, the bottom 50% of families, representing 62 million American households, average $11,000 net worth. From an international perspective, the difference in US median and mean wealth per adult is over 600%.
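The $760,000-per-family figure is just the aggregate net worth divided evenly across households; a quick check of that arithmetic:

```python
total_net_worth = 94.7e12   # $94.7 trillion, Q1 2017
households = 124e6          # 124 million U.S. households

per_household = total_net_worth / households
print(round(per_household))   # about 763,710, i.e. roughly $760,000 per family
```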
Just prior to President Obama's 2014 State of the Union Address, media reported that the wealthiest 1% possess 40% of the nation's wealth while the bottom 80% own 7%; similarly, but later, the media reported that the "richest 1 percent in the United States now own more wealth than the bottom 90 percent".
The gap between the top 10% and the middle class is over 1,000%; that increases another 1,000% for the top 1%. The average employee "needs to work more than a month to earn what the CEO earns in one hour." Although different from income inequality, the two are related.
In the 2013 documentary Inequality for All, in which he argued that income inequality is the defining issue for the United States, Robert Reich states that 95% of economic gains since 2009, when the recovery was believed to have started, went to the top 1% by net worth (high-net-worth individuals, or HNWIs). More recently, in 2017, an Oxfam study found that eight rich people, six of them Americans, own as much combined wealth as half the human race.
From 1989 to 2018, the top 1 percent increased its total net worth by $21 trillion, while the bottom 50 percent saw its net worth decrease by $900 billion over the same period (in 2018 dollars).
A 2011 study found that US citizens across the political spectrum dramatically underestimate the current US wealth inequality and would prefer a far more egalitarian distribution of wealth.
Wealth is usually not used for daily expenditures or factored into household budgets, but combined with income it comprises the family's total opportunity to secure a desired stature and standard of living, or pass their class status along to one's children.
Moreover, wealth provides for both short- and long-term financial security, bestows social prestige, contributes to political power, and can be used to produce more wealth.
Hence, wealth possesses a psychological element that awards people the feeling of agency, or the ability to act. The accumulation of wealth grants more options and eliminates restrictions about how one can live life.
Dennis Gilbert asserts that the standard of living of the working and middle classes is dependent upon income and wages, while the rich tend to rely on wealth, distinguishing them from the vast majority of Americans. A September 2014 study by Harvard Business School declared that the growing disparity between the very wealthy and the lower and middle classes is no longer sustainable.
Statistics:
In 2007, the wealthiest 20% possessed 80% of all financial assets. In 2007 the richest 1% of the American population owned 35% of the country's total wealth, and the next 19% owned 51%. Thus, the top 20% of Americans owned 86% of the country's wealth and the bottom 80% of the population owned 14%.
In 2011, financial inequality was greater than inequality in total wealth, with the top 1% of the population owning 43%, the next 19% of Americans owning 50%, and the bottom 80% owning 7%. However, after the Great Recession which started in 2007, the share of total wealth owned by the top 1% of the population grew from 35% to 37%, and that owned by the top 20% of Americans grew from 86% to 88%. The Great Recession also caused a drop of 36% in median household wealth, but a drop of only 11% for the top 1%, further widening the gap between the top 1% and the bottom 99%.
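The group shares quoted in these paragraphs are internally consistent; a quick sketch of the arithmetic:

```python
# 2007 shares of total wealth quoted above.
top_1, next_19 = 35, 51
top_20 = top_1 + next_19
bottom_80 = 100 - top_20
print(top_20, bottom_80)          # 86 and 14, matching the text

# 2011 shares of financial assets, which are more skewed than total wealth.
fin_top_1, fin_next_19 = 43, 50
fin_bottom_80 = 100 - fin_top_1 - fin_next_19
print(fin_bottom_80)              # 7, matching the text
```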
According to PolitiFact and others, in 2011 the 400 wealthiest Americans had more wealth than half of all Americans combined. Inherited wealth may help explain why many Americans who have become rich had a substantial head start. In September 2012, according to the Institute for Policy Studies, over 60 percent of the Forbes 400 richest Americans grew up in substantial privilege.
In 2013 wealth inequality in the U.S. was greater than in most developed countries other than Switzerland and Denmark. In the United States, the use of offshore holdings is exceptionally small compared to Europe, where much of the wealth of the top percentiles is kept in offshore holdings.
While the statistical problem is Europe-wide, in Southern Europe statistics become even more unreliable. Fewer than a thousand people in Italy have declared incomes of more than 1 million euros. A former Prime Minister of Italy described tax evasion as a "national pastime".
According to a 2014 Credit Suisse study, the ratio of wealth to household income is the highest it has been since the Great Depression.
However, according to the Federal Reserve, "For most households, pensions and Social Security are the most important sources of income during retirement, and the promised benefit stream constitutes a sizable fraction of household wealth" and "including pensions and Social Security in net worth makes the distribution more even". A September 2017 study by the Federal Reserve reported that the top 1% owned 38.5% of the country's wealth in 2016.
According to a June 2017 report by the Boston Consulting Group, around 70% of the nation's wealth will be in the hands of millionaires and billionaires by 2021.
Early 20th century:
Pioneering work by Simon Kuznets using income tax records and his own well-researched estimates of national income showed a reduction of about 10% in the portion of national income going to the top 10%, a reduction from about 45–50% in 1913 to about 30–35% in 1948.
This period spans both the Great Depression and World War II, events with significant economic consequences. The reduction is known as the Great Compression.
Wealth and Income:
There is an important distinction between income and wealth. Income refers to a flow of money over time in the form of a rate (per hour, per week, or per year); wealth is a collection of assets owned minus liabilities. In essence, income is specifically what people receive through work, retirement, or social welfare whereas wealth is what people own.
While the two are seemingly related, income inequality alone is insufficient for understanding economic inequality for two reasons:
- It does not accurately reflect an individual's economic position
- Income does not portray the severity of financial inequality in the United States.
The United States Census Bureau formally defines income as money received on a regular basis (excluding certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc.
By this official measure, the wealthiest families may have low income, but the value of their assets earns enough money to support their lifestyle. Dividends from trusts or gains in the stock market do not fall under the definition of income but are the primary money flows for the wealthy. Retired people also have little income but usually have a higher net worth because of money saved over time.
Additionally, income does not capture the extent of wealth inequality. Wealth is accumulated over time from income earnings and the growth of assets; the income of one year cannot encompass a lifetime of accumulation, so income statistics cover too narrow a time span to be an adequate indicator of financial inequality.
The Gini coefficient is an economic measure on a scale from 0 to 1, where 0 represents perfect equality and 1 perfect inequality. The Gini coefficient for wealth inequality increased from 0.80 in 1983 to 0.84 in 1989, while in 1989 the Gini coefficient for income was only 0.52. From this data it is evident that in 1989 the extent of wealth inequality was significantly higher than income inequality.
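The Gini coefficients cited here can be computed for any distribution with a short function. This minimal sketch uses the standard rank-weighted formula and illustrative toy data, not the historical survey data behind the 0.80–0.84 figures.

```python
def gini(values):
    """Gini coefficient of a non-negative distribution:
    0 is perfect equality, 1 is all wealth held by one unit."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    if total == 0:
        return 0.0
    # G = 2 * sum(rank_i * x_i) / (n * total) - (n + 1) / n, ranks from 1.
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

print(gini([1, 1, 1, 1]))     # 0.0: everyone holds the same amount
print(gini([0, 0, 0, 100]))   # 0.75: one of four units holds everything
```

As the sample grows, the one-holder case approaches 1; the gap between a wealth Gini of 0.84 and an income Gini of 0.52 in 1989 is exactly the discrepancy the paragraph describes.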
Recent research shows that many households, in particular those headed by young parents (younger than 35), minorities, and individuals with low educational attainment, display very little accumulation. Many have no financial assets and their total net worth is also low.
According to the Congressional Budget Office, between 1979 and 2007 incomes of the top 1% of Americans grew by an average of 275%. ...
(Note: The IRS cautions that comparisons of adjusted gross income pre-1987 and post-1987 are complicated by large changes in the definition of AGI, which led households in the top income quintile to report much more of their income on their individual income tax forms' AGI rather than reporting their business income on separate corporate tax returns, or to omit certain non-taxable income, such as municipal bond income, from their AGI entirely.
Anyone who wants to discuss incomes in the U.S. fairly should include a chart of all available data split by quantile up to the mid-1980s, followed by a chart from 1990 to 2011; the five-year gap would avoid the major AGI definition changes.
The big picture of this subject is not just a segment of all available data starting in 1979, especially after the IRS warned about the large AGI definition changes in the late 1980s. In addition, IRS studies consistently show that a majority of households in the top income quintile move to a lower quintile within one decade.
There is even more turnover among households in the top 1%. Without those data, a reader is likely to assume households in the top 1% are almost the same from year to year.) In 2009, people in the top 1% of taxpayers made $343,927 or more.
According to US economist Joseph Stiglitz, the richest 1% of Americans gained 93% of the additional income created in 2010. A study by Emmanuel Saez and Thomas Piketty showed that the top 10 percent of earners took more than half of the country's total income in 2012, the highest level recorded since the government began collecting the relevant data a century ago.
People in the top one percent were three times more likely to work more than 50 hours a week, were more likely to be self-employed, and earned a fifth of their income as capital income. The top one percent was composed of many professions and had an annual turnover rate of more than 25%. The five most common professions were managers, physicians, administrators, lawyers, and teachers.
In the book Modern Labor Economics: Theory and Public Policy, it is noted that in the United States all income that employees received from their employers in 2012 was 8.6 trillion dollars while the amount of money received from all other sources of personal income in that year came to 5.3 trillion dollars. This makes the relationship of employee to employer and vocational employment in general of paramount importance in the United States.
Gender Pay Inequality
Further information: Gender pay gap
Wealth inequality and child poverty:
Further information: Child poverty in the United States
In 2013, UNICEF data on the well-being of children in 35 developed nations ranked the United States 34th out of 35, ahead of only Romania. This may reflect growing income inequality.
U.S. stock market ownership distribution:
In March 2017, NPR summarized the distribution of U.S. stock market ownership (direct and indirect through mutual funds) in the U.S., which is highly concentrated among the wealthiest families:
- 52% of U.S. adults owned stock in 2016. Ownership peaked at 65% in 2007 and fell significantly due to the Great Recession.
- As of 2013, the top 1% of households owned 38% of stock market wealth.
- As of 2013, the top 10% own 81% of stock wealth, the next 10% (80th to 90th percentile) own 11% and the bottom 80% own 8%.
The Federal Reserve reported the median value of stock holdings by income group for 2016:
- Bottom 20%: $5,800.
- 20th to 40th percentile: $10,000.
- 40th to 60th percentile: $15,500.
- 60th to 80th percentile: $31,700.
- 80th to 89th percentile: $82,000.
- Top 10%: $365,000.
NPR reported that when politicians reference the stock market as a measure of economic success, that success is not relevant to nearly half of Americans. Further, more than one-third of Americans who work full-time have no access to pensions or retirement accounts such as 401(k)s that derive their value from financial assets like stocks and bonds.
The New York Times reported that the percentage of workers covered by generous defined-benefit pension plans declined from 62% in 1983 to 17% by 2016. While some economists consider an increase in the stock market to have a "wealth effect" that increases economic growth, economists like former Dallas Federal Reserve Bank President Richard Fisher believe those effects are limited.
Causes of wealth inequality:
Main article: Causes of income inequality in the United States
Essentially, the wealthy possess greater financial opportunities that allow their money to make more money. Earnings from the stock market or mutual funds are reinvested to produce a larger return. Over time, the sum that is invested becomes progressively more substantial.
Those who are not wealthy, however, do not have the resources to enhance their opportunities and improve their economic position. Rather, "after debt payments, poor families are constrained to spend the remaining income on items that will not produce wealth and will depreciate over time."
Scholar David B. Grusky notes that "62 percent of households headed by single parents are without savings or other financial assets." Net indebtedness generally prevents the poor from having any opportunity to accumulate wealth and thereby better their conditions.
Economic inequality is a result of difference in income. Factors that contribute to this gap in wages are things such as level of education, labor market demand and supply, gender differences, growth in technology, and personal abilities. The quality and level of education that a person has often corresponds to their skill level, which is justified by their income.
Wages are also determined by the "market price of a skill" at that current time.
Although gender inequality is a separate social issue, it plays a role in economic inequality. According to the U.S. Census Report, in America the median full-time salary for women is 77 percent of that for men.
Also contributing to the wealth inequality in the U.S., both unskilled and skilled workers are being replaced by machinery. The Seven Pillars Institute for Global Finance and Ethics argues that because of this "technological advance", the income gap between workers and owners has widened.
Income inequality contributes to wealth inequality. For example, economist Emmanuel Saez wrote in June 2016 that the top 1% of families captured 52% of the total real income (GDP) growth per family from 2009-2015. From 2009 to 2012, the top 1% captured 91% of the income gains.
Notably, for both the wealthy and not-wealthy, the process of accumulation or debt is cyclical. The rich use their money to earn larger returns and the poor have no savings with which to produce returns or eliminate debt. Unlike income, both facets are generational.
Wealthy families pass down their assets allowing future generations to develop even more wealth. The poor, on the other hand, are less able to leave inheritances to their children leaving the latter with little or no wealth on which to build...This is another reason why wealth inequality is so important, its accumulation has direct implications for economic inequality among the children of today's families.
Corresponding to financial resources, the wealthy strategically organize their money so that it will produce profit. Affluent people are more likely to allocate their money to financial assets such as stocks, bonds, and other investments which hold the possibility of capital appreciation.
Those who are not wealthy are more likely to have their money in savings accounts and home ownership. This difference comprises the largest reason for the continuation of wealth inequality in America: the rich are accumulating more assets while the middle and working classes are just getting by.
As of 2007, the richest 1% held about 38% of all privately held wealth in the United States, while the bottom 90% held 73.2% of all debt. According to The New York Times, the richest 1 percent in the United States now own more wealth than the bottom 90 percent.
However, other studies argue that a higher average savings rate will reduce the share of wealth owned by the rich. The reason is that those richest in wealth are not necessarily the individuals with the highest incomes. If the savings rate out of income is very large, the relative wealth share of the poorer quantiles of the population will therefore increase, even though their absolute distance from the wealthiest will also grow.
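The mechanism claimed here can be made concrete with a toy sketch. The numbers below are invented for illustration; the only assumption taken from the argument above is that income is less concentrated than wealth, so wealth accumulated out of saved income raises the poorer group's *share* of total wealth even while the *absolute* gap keeps widening.

```python
# Toy model (invented numbers): two groups accumulate wealth by saving
# a fixed fraction of income each period.

def wealth_after(periods, savings_rate, wealth, income):
    """Each period, every group adds savings_rate * income to its wealth."""
    return [w + savings_rate * y * periods for w, y in zip(wealth, income)]

wealth = [1000.0, 100.0]   # the rich group holds ~91% of wealth...
income = [60.0, 40.0]      # ...but only 60% of income

w_rich, w_poor = wealth_after(periods=50, savings_rate=0.3,
                              wealth=wealth, income=income)

share_poor_before = wealth[1] / sum(wealth)      # ~0.09
share_poor_after = w_poor / (w_rich + w_poor)    # ~0.27: the share rises...
gap_before = wealth[0] - wealth[1]               # 900
gap_after = w_rich - w_poor                      # 1200: ...but the gap widens
```

With a high savings rate the poorer group's wealth share drifts toward its income share (here 40%), which is exactly the sense in which saving reduces the share held by the already-rich without closing the absolute gap.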
As the price of commodities increases because of inflation, a larger percentage of lower-class people's money is spent on things they need to survive and go to work, such as food and gasoline. Most of the working poor are paid fixed hourly wages that do not keep up with rises in prices, so every year an increasing percentage of their income is consumed until they have to go into debt just to survive. At this point, their little wealth is owed to lenders and banking institutions.
Economists and politicians such as Emmanuel Saez, Thomas Piketty, and Barack Obama have suggested that the nature of tax policy in America perpetuates economic inequality by steering large sums of wealth into the hands of the wealthiest Americans. The mechanism is that when the wealthy avoid paying taxes, wealth concentrates in their coffers and the poor go into debt.
The economist Joseph Stiglitz argues that "Strong unions have helped to reduce inequality, whereas weaker unions have made it easier for CEOs, sometimes working with market forces that they have helped shape, to increase it." The long fall in unionization in the U.S. since WWII has seen a corresponding rise in the inequality of wealth and income.
Racial disparities:
The wealth gap between white and black families nearly tripled from $85,000 in 1984 to $236,500 in 2009.
There are many causes, including years of home ownership, household income, unemployment, and education, but inheritance might be the most important. Inheritance can directly link the disadvantaged economic position and prospects of today's blacks to the disadvantaged positions of their parents' and grandparents' generations.
According to a report done by Robert B. Avery and Michael S. Rendall, "one in three white households will receive a substantial inheritance during their lifetime compared to only one in ten black households."
This relative lack of inheritance that has been observed among African Americans can be attributed in large part to factors such as unpaid labor (slavery), violent destruction of personal property in incidents such as Red Summer of 1919, unequal opportunity in education and employment (racial discrimination), and more recent policies such as redlining and planned shrinkage.
Other ethnic minorities, particularly those with darker complexions, have at times faced many of these same adversities to various degrees.
The article "America's Financial Divide" added context to racial wealth inequality stating "…nearly 96.1 percent of the 1.2 million households in the top one percent by income were white, a total of about 1,150,000 households. In addition, these families were found to have a median net asset worth of $8.3 million. In stark contrast, in the same piece, black households were shown as a mere 1.4 percent of the top one percent by income, that's only 16,800 homes.
In addition, their median net asset worth was just $1.2 million. Using this data as an indicator only several thousand of the over 14 million African American households have more than $1.2 million in net assets… Relying on data from Credit Suisse and Brandeis University's Institute on Assets and Social Policy, the Harvard Business Review in the article "How America's Wealthiest Black Families Invest Money" recently took the analysis above a step further.
In the piece the author stated "If you're white and have a net worth of about $356,000, that's good enough to put you in the 72nd percentile of white families. If you're black, it's good enough to catapult you into the 95th percentile." This means 28 percent of the total 83 million white homes, or over 23 million white households, have more than $356,000 in net assets. While only 700,000 of the 14 million black homes have more than $356,000 in total net worth."
According to Inequality.org, when durable goods such as automobiles are deducted, the median black family is actually worth only $1,700. In contrast, the median white family holds $116,800 of wealth using the same accounting methods.
Some historical context: In South Africa, during the atrocities of apartheid, the median black family held about 7 percent of typical white South African family net worth. Today, using Wolff’s analysis, the median African American family holds a mere 1.5 percent of median white American family wealth.
A recent piece on Eurweb/Electronic Urban Report "Black Wealth Hardly Exists, Even When You Include NBA, NFL and Rap Stars" stated this about the difference between black middle class families and white middle class families. "Going even further into the data, a recent study by the Institute for Policy Studies (IPS) and the Corporation for Enterprise Development (CFED) found that it would take 228 years for the average black family to amass the same level of wealth the average white family holds today in 2016.
All while white families create even more wealth over those same two hundred years. In fact, this is a gap that will never close if America stays on its current economic path. According to the Institute on Assets and Social Policy, for each dollar of increase in average income an African American household saw from 1984 to 2009 just $0.69 in additional wealth was generated, compared with the same dollar in increased income creating an additional $5.19 in wealth for a similarly situated white household."
Author Lilian Singh wrote on why the perceptions about black life created by media are misleading in the American Prospect piece "Black Wealth On TV: Realities Don’t Match Perceptions". "Black programming features TV shows that collectively create false perceptions of wealth for African-American families. The images displayed are in stark contrast to the economic conditions the average black family is battling each day."
In "The Decadent Veil: Black America's Wealth Illusion," an article on Huffington Post, Antonio Moore takes the question of inequity another critical step forward, digging into how celebrity masks this massive inequality.
Excerpt:
- "The decadent veil looks at black Americans through a lens of group theory and seeks to explain an illusion that has taken form over a 30-year span of financial deregulation and new found access to unsecured credit. This veil is trimmed with million-dollar sports contracts, Roc Nation tour deals and designer labels made for heads of state.
- As black celebrity invited us into their homes through shows like MTV cribs, we forgot the condition of overall African American financial affairs. Despite a large section of the 14 million black households drowning in poverty and debt the stories of a few are told as if they represent those of millions, not thousands.
- It is this new veil of economics that has allowed for a broad swath of America to become not just desensitized to black poverty, but also hypnotized by black celebrity… The decadent veil not only warps the black community's vision outward to a larger economic world, but it also distorts outside community's view of Black America's actual financial reality."
According to an article by the Pew Research Center, the median wealth of non-Hispanic black households fell nearly 38% from 2010 to 2013, from $16,600 to $13,700. The median wealth of Hispanic households fell 14.3% as well, from $16,000 to $14,000.
Despite the median net worth of all households in the United States decreasing with time, as of 2013 white households had a median net worth of $141,900 while black households had a median net worth of just $11,000.
Hispanic households had a median net worth of just $13,700 over that time as well.
Effect on democracy:
See also: Income inequality in the United States § Effects on democracy and society
A 2014 study by researchers at Princeton and Northwestern concludes that government policies reflect the desires of the wealthy, and that the vast majority of American citizens have "minuscule, near-zero, statistically non-significant impact upon public policy … when a majority of citizens disagrees with economic elites and/or with organized interests, they generally lose."
When Fed chair Janet Yellen was questioned by Bernie Sanders about the study at a congressional hearing in May 2014, she responded "There’s no question that we’ve had a trend toward growing inequality" and that this trend "can shape [and] determine the ability of different groups to participate equally in a democracy and have grave effects on social stability over time."
In Capital in the Twenty-First Century, French economist Thomas Piketty argues that "extremely high levels" of wealth inequality are "incompatible with the meritocratic values and principles of social justice fundamental to modern democratic societies" and that "the risk of a drift towards oligarchy is real and gives little reason for optimism about where the United States is headed."
According to Jedediah Purdy, a researcher at the Duke School of Law, wealth inequality in the United States has repeatedly exposed the many problems and shortcomings of its financial system over at least the last fifty years of debate. For years, people believed that distributive justice would produce a sustainable level of wealth inequality, and that a certain kind of state would be able to effectively diminish the inequality that did occur.
What was largely unexpected was that the inequality created by growing markets would itself weaken the state and prevent the majority of the political community from delivering on its plans for distributive justice; this has only lately come to the attention of the broad majority.
Effect on health and well being:
The 2019 World Happiness Report shows the US slipping to 19th place due to increasing wealth inequality, along with rising healthcare costs, surging addiction rates, and an unhealthy work–life balance.
Proposals to reduce wealth inequality:
Taxation of wealth:
Senator Elizabeth Warren proposed an annual tax on wealth in January 2019, specifically a 2% tax for wealth over $50 million and another 1% surcharge on wealth over $1 billion.
Wealth is defined as including all asset classes, including financial assets and real estate.
Economists Emmanuel Saez and Gabriel Zucman estimated that about 75,000 households (less than 0.1%) would pay the tax. The tax would raise around $2.75 trillion over 10 years, roughly 1% of GDP on average per year. It would raise the total tax burden for those subject to the wealth tax from 3.2% of their wealth under current law to about 4.3% on average, versus 7.2% for the bottom 99% of families.
For scale, the federal budget deficit in 2018 was 3.9% of GDP and is expected to rise toward 5% of GDP over the next decade. The plan received both praise and criticism. Two billionaires, Michael Bloomberg and Howard Schultz, criticized the proposal as "unconstitutional" and "ridiculous," respectively. Warren was not surprised by this reaction, stating: "Another billionaire who thinks that billionaires shouldn't pay more in taxes." Economist Paul Krugman wrote in January 2019 that polls indicate the idea of taxing the rich more is very popular.
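The bracket structure described above lends itself to a simple marginal-rate calculation. A minimal sketch (the function and its name are illustrative, not from any official source; the thresholds and rates are those quoted in the proposal):

```python
def warren_wealth_tax(net_worth):
    """Annual tax under the proposal as described: 2% on wealth above
    $50 million, plus a further 1% surcharge on wealth above $1 billion
    (so wealth above $1 billion is taxed at 3% in total)."""
    tax = 0.0
    if net_worth > 50e6:
        tax += 0.02 * (net_worth - 50e6)   # 2% on wealth above $50M
    if net_worth > 1e9:
        tax += 0.01 * (net_worth - 1e9)    # +1% surcharge above $1B
    return tax
```

On a $100 million fortune this yields a $1 million annual bill (an effective rate of 1%), since only the $50 million above the threshold is taxed; a $2 billion fortune yields $49 million (2% of $1.95 billion plus 1% of $1 billion).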
Limit or tax stock buybacks:
Senators Charles Schumer and Bernie Sanders advocated limiting stock buybacks in January 2019. They explained that from 2008 to 2017, 466 of the S&P 500 companies spent $4 trillion on stock buybacks, about 50% of their profits, with another 40% going to dividends.
During 2018 alone, a record $1 trillion was spent on buybacks. Stock buybacks shift wealth upwards, because the top 1% own about 40% of shares and the top 10% own about 85%.
Further, corporations directing profits to shareholders are not reinvesting the money in the firm or paying workers more. They wrote: "If corporations continue to purchase their own stock at this rate, income disparities will continue to grow, productivity will suffer, the long-term strength of companies will diminish — and the American worker will fall further behind."
Their proposed legislation would prohibit buybacks unless the corporation has taken other steps first, such as paying workers more, providing more benefits such as healthcare and pensions, and investing in the community. To prevent corporations from shifting from buybacks to dividends, they proposed limiting dividends, perhaps by taking action through the tax code.
See also:
- Wealth Inequality in the United States Since 1913: Evidence from Capitalized Income Tax Data. (Emmanuel Saez and Gabriel Zucman, National Bureau of Economic Research - October 2014.)
- State of the Union: Essential Inequality Reader. (Moyers & Company - December 23, 2013).
- Americans Underestimate U.S. Wealth Inequality (Audio - NPR).
- The United States of Inequality (10-part Visual Guide - Slate magazine).
- 15 Mind-Blowing Facts About Wealth And Inequality In America (Charts - The Business Insider).
- America's Disappearing Middle Class: Implications for Public Policy and Politics (Trevor Beltz - May, 2012).
- Wealth Inequality in America (Video).
- Nine Charts about Wealth Inequality in America (April 2015), The Urban Institute
- What Happened to America’s Wealth? The Rich Hid It. Moyers & Company. July 7, 2017
- Wealth Inequality Is Higher Than Ever. Jacobin. October 1, 2017.
- Affluence in the United States
- Distribution of wealth in Europe
- Citizens United v. Federal Election Commission
- Donor Class
- Income inequality in the United States
- Monetary policy
- Net worth
- Occupy movement
- Occupy Wall Street
- Oligarchy
- Panama Papers
- Paradise Papers
- Pareto principle
- Plutocracy
- Power elite
- Redistribution of wealth
- Tax Policy and Economic Inequality in the United States
- The Divide: American Injustice in the Age of the Wealth Gap
- Wealth concentration
- Wealth in the United States
- We are the 99%
- American upper class
- List of Americans by net worth
Causes of income inequality in the United States describes why changes in the country's income distribution are occurring. This topic is subject to extensive ongoing research, media attention, and political interest, as it involves how the national income of the country is split among its people at various income levels.
Overview:
Income inequality in the United States (previous topic) has grown significantly since the early 1970s, after several decades of stability, and has been the subject of study by many scholars and institutions. The U.S. consistently exhibits higher rates of income inequality than most developed nations, arguably due to the nation's comparatively strong embrace of free-market capitalism.
According to the CBO and others, "the precise reasons for the [recent] rapid growth in income at the top are not well understood", but "in all likelihood," an "interaction of multiple factors" was involved. "Researchers have offered several potential rationales"; some of these conflict and some overlap.
Paul Krugman put several of these factors into context in January 2015: "Competition from emerging-economy exports has surely been a factor depressing wages in wealthier nations, although probably not the dominant force. More important, soaring incomes at the top were achieved, in large part, by squeezing those below: by cutting wages, slashing benefits, crushing unions, and diverting a rising share of national resources to financial wheeling and dealing ... Perhaps more important still, the wealthy exert a vastly disproportionate effect on policy. And elite priorities — obsessive concern with budget deficits, with the supposed need to slash social programs — have done a lot to deepen [wage stagnation and income inequality]."
Divergence of productivity and compensation:
Overall:
One view of economic equity is that employee compensation should rise with productivity (defined as real output per hour of labor worked). In other words, if the employee produces more, they should be paid accordingly. If pay lags behind productivity, income inequality grows, as labor's share of the output is falling, while capital's share (generally higher-income owners) is rising.
According to a June 2017 report from the non-partisan Bureau of Labor Statistics (BLS), productivity rose in tandem with employee compensation (a measure which includes wages as well as benefits such as health insurance) from the 1940s through the 1970s. However, since then productivity has grown faster than compensation.
BLS refers to this as the "productivity-compensation gap", an issue which has garnered much attention from academics and policymakers. BLS reported this gap occurs across most industries: "When examined at a detailed industry level, the average annual percent change in productivity outpaced compensation in 83 percent of 183 industries studied" measured from 1987-2015.
For example, in the information industry, productivity increased at an annual average rate of 5.0% over the 1987-2015 period, while compensation increased at about a 1.5% rate, resulting in a 3.5% productivity gap. In Manufacturing, the gap was 2.7%; in Retail Trade 2.6%; and in Transportation and Warehousing 1.3%. This analysis adjusted for inflation using the Consumer Price Index or CPI, a measure of inflation based on what is consumed, rather than what is produced.
Analyzing the gap:
BLS explained the gap between productivity and compensation can be divided into two components, the effect of which varies by industry: 1) Recalculating the gap using an industry-specific inflation adjustment ("industry deflator") rather than consumption (CPI); and 2) The change in labor's share of income, defined as how much of a business' revenue goes to workers as opposed to intermediate purchases (i.e., cost of goods) and capital (owners) in that industry.
The difference in deflators was the stronger effect among high-productivity-growth industries, while the change in labor's share of income was the stronger effect among most other industries. For example, the 3.5% productivity gap in the information industry was composed of a 2.1% difference in deflators and about 1.4% due to the change in labor's share. The 2.7% gap in Manufacturing included 1.0% due to the deflator difference and 1.7% due to the change in labor's share.
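The decomposition described above is simple additive accounting: the gap is productivity growth minus compensation growth, and the deflator and labor-share components sum back to it. A minimal sketch using the information-industry figures quoted above:

```python
# Information industry, 1987-2015 annual averages, in percentage points
# per year (figures as quoted from the BLS report above).
productivity_growth = 5.0
compensation_growth = 1.5
gap = productivity_growth - compensation_growth       # the 3.5-point gap

# BLS splits the gap into two additive components:
deflator_component = 2.1                              # industry deflator vs. CPI
labor_share_component = gap - deflator_component      # ~1.4, falling labor share

print(f"gap = {gap:.1f}; deflator = {deflator_component:.1f}; "
      f"labor share = {labor_share_component:.1f}")
```

The same arithmetic applied to Manufacturing (a 2.7-point gap with a 1.0-point deflator component) leaves 1.7 points attributable to the declining labor share, matching the figures above.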
Reasons for the gap:
BLS explained the decline in labor share as likely driven by three factors that vary by industry:
Market Factors:
Globalization:
Globalization refers to the integration of economies in terms of trade, information, and jobs. Innovations in supply chain management enabled goods to be sourced in Asia and shipped to the United States less expensively than in the past. This integration of economies, particularly with the U.S. and Asia, had dramatic impacts on income inequality globally.
Economist Branko Milanovic analyzed global income inequality, comparing 1988 and 2008. His analysis indicated that the global top 1% and the middle classes of the emerging economies (e.g., China, India, Indonesia, Brazil and Egypt) were the main winners of globalization during that time.
The real (inflation adjusted) income of the global top 1% increased approximately 60%, while the middle classes of the emerging economies (those around the 50th percentile of the global income distribution in 1988) rose 70–80%.
For example, in 2000, 5 million Chinese households earned between $11,500 and $43,000 in 2016 dollars. By 2015, 225 million did. On the other hand, those in the middle class of the developed world (those in the 75th to 90th percentile in 1988, such as the American middle class) experienced little real income gains.
The richest 1% contains 60 million persons globally, including 30 million Americans (i.e., the top 12% of Americans by income were in the global top 1% in 2008), the most out of any country.
While economists who have studied globalization agree imports have had an effect, the timing of import growth does not match the growth of income inequality. By 1995 imports of manufactured goods from low-wage countries totalled less than 3% of US gross domestic product.
It wasn't until 2006 that the US imported more manufactured goods from low-wage (developing) countries than from high-wage (advanced) economies. Inequality increased during the 2000–2010 decade not because of stagnating wages for less-skilled workers, but because of accelerating incomes of the top 0.1%. Author Timothy Noah estimates that "trade" (increases in imports) is responsible for just 10% of the "Great Divergence" in income distribution.
Journalist James Surowiecki notes that in the last 50 years, companies and the sectors of the economy providing the most employment in the US – major retailers, restaurant chains, and supermarkets – are ones with lower profit margins and less pricing power than in the 1960s; while sectors with high profit margins and average salaries – like high technology – have relatively few employees.
Some economists claim that WTO-led globalization and competition from developing countries, especially China, has resulted in the recent decline in labor's share of income and increased unemployment in the U.S. The Economic Policy Institute and the Center for Economic and Policy Research argue that some trade agreements, such as the Trans-Pacific Partnership, could result in further job losses and declining wages.
One argument contrary to the globalization/technology hypothesis relates to variation across countries. Japan, Sweden and France did not experience significant increases in income inequality during the 1979–2010 period, although the U.S. did.
The top 1% income group continued to receive less than 10% of the income share in these countries, while the U.S. share rose from 10% to over 20%. Economist Emmanuel Saez wrote in 2014: "Differences across countries rule out technical change/globalization as the sole explanation ... Policies play a key role in shaping inequality (tax and transfer policies, regulations, education)."
Superstar hypothesis:
Eric Posner and Glen Weyl point out that inequality can be predominantly explained by the superstar hypothesis. In their opinion Piketty fails to observe the accelerated turnover that is occurring in the Forbes 400; only 35 people from the original 1982 list remain today. Many have fallen off as a result of heavy spending, large-scale philanthropy, and bad investments.
The current Forbes 400 is now primarily made up of newly wealthy business owners, not heirs and heiresses. In parallel research, the University of Chicago's Steven Kaplan and Stanford University's Joshua Rauh note that 69% of those on the Forbes list are actually first generation wealth creators. That figure has risen dramatically since 1982 when it stood at 40%.
Ed Dolan supports the globalization and superstar hypothesis but points out that the high earnings are based, to some extent, on moral hazard like "Bonus-based compensation schemes with inadequate clawback for losses" and the shift of losses to shareholders, unsecured creditors, or taxpayers.
Paul Krugman argues that for the US the surge in inequality to date is mainly due to supersalaries, but capital has nonetheless been significant too. When the current generation of the 1% turns its wealth over to its heirs, those heirs will become rentiers, people who live off accumulated capital. Two decades from now, America could turn into a rentier-dominated society even more unequal than Belle Époque Europe.
One study extended the superstar hypothesis to corporations, with firms that are more dominant in their industry (in some cases due to oligopoly or monopoly) paying their workers far more than the average in the industry. Another study noted that "superstar firms" is another explanation for the decline in the overall share of income (GDP) going to workers/labor as opposed to owners/capital.
Education:
Main article: Educational attainment in the United States
Income differences between the varying levels of educational attainment (usually measured by the highest degree of education an individual has completed) have increased. Expertise and skill certified through an academic degree translates into increased scarcity of an individual's occupational qualification which in turn leads to greater economic rewards.
As the United States has developed into a post-industrial society more and more employers require expertise that they did not a generation ago, while the manufacturing sector which employed many of those lacking a post-secondary education is decreasing in size.
In the resulting economic job market the income discrepancy between the working class and the professional with the higher academic degrees, who possess scarce amounts of certified expertise, may be growing.
Households in the upper quintiles are generally home to more, better educated and employed working income earners, than those in lower quintiles.
Among those in the upper quintile, 62% of householders were college graduates, 80% worked full-time, and 76% of households had two or more income earners, compared to the national percentages of 27%, 58% and 42%, respectively. US Census Bureau data indicate that occupational achievement and the possession of scarce skills correlate with higher income.
Average earnings in 2002 for the population 18 years and over were higher at each progressively higher level of education ... This relationship holds true not only for the entire population but also across most subgroups. Within each specific educational level, earnings differed by sex and race. This variation may result from a variety of factors, such as occupation, working full- or part-time, age, or labor force experience.
The "college premium" refers to the increase in income to workers with four-year college degrees relative to those without. The college premium doubled from 1980 to 2005, as the demand for college-educated workers has exceeded the supply.
Economists Goldin and Katz estimate that the increase in economic returns to education was responsible for about 60% of the increase in wage inequality between 1973 and 2005. The supply of available graduates did not keep up with business demand due primarily to increasingly expensive college educations.
Annual tuition at public and private universities averaged 4% and 20% respectively of the annual median family income from the 1950s to 1970s; by 2005 these figures were 10% and 45% as colleges raised prices in response to demand.
Economist David Autor wrote in 2014 that approximately two-thirds of the rise in income inequality between 1980 and 2005 was accounted for by the increased premium associated with education in general and post-secondary education in particular.
Two researchers have suggested that during the first four formative years of development, children in low-income families are exposed to 636 words an hour, as opposed to 2,153 words in high-income families. This, in turn, leads to lower achievement in later schooling because the low-income group is less able to verbalize concepts.
A psychologist has stated that society stigmatizes poverty; conversely, poor people tend to believe that the wealthy have been lucky or have earned their money through illegal means. She believes that both attitudes must be discarded if the nation is to make headway in addressing inequality. She suggests that college should not be a litmus test of success, and that valorizing one profession as more important than another is a problem.
Skill-biased technological change:
As of the mid-to-late 2000s, the most common explanation for income inequality in America was "skill-biased technological change" (SBTC) – "a shift in the production technology that favors skilled over unskilled labor by increasing its relative productivity and, therefore, its relative demand".
For example, one scholarly colloquium on the subject that included many prominent labor economists estimated that technological change was responsible for over 40% of the increase in inequality.
Other factors like international trade, decline in real minimum wage, decline in unionization and rising immigration, were each responsible for 10–15% of the increase.
Education has a notable influence on income distribution. In 2005, roughly 55% of income earners with doctorate degrees – the most educated 1.4% – were among the top 15 percent earners. Among those with Master's degrees – the most educated 10% – roughly half had incomes among the top 20 percent of earners. Only among households in the top quintile were householders with college degrees in the majority.
But while the higher education commonly translates into higher income, and the highly educated are disproportionately represented in upper quintile households, differences in educational attainment fail to explain income discrepancies between the top 1 percent and the rest of the population.
Large percentages of individuals lacking a college degree are present in all income demographics, including 33% of those heading households with six-figure incomes. From 2000 to 2010, the 1.5% of Americans with an M.D., J.D., or M.B.A. and the 1.5% with a Ph.D. saw median income gains of approximately 5%.
Among those with a college or master's degree (about 25% of the American workforce), average wages dropped by about 7% (though this was less than the decline in wages for those who had not completed college). Post-2000 data have provided "little evidence" for SBTC's role in increasing inequality: the wage premium for the college-educated has risen little, and there has been little shift in employment shares toward more highly skilled occupations.
Approaching the issue from occupations that have been replaced or downgraded since the late 1970s, one scholar found that jobs that "require some thinking but not a lot" – or moderately skilled middle-class occupations such as cashiers, typists, welders, farmers, appliance repairmen – declined the furthest in wage rates and/or numbers. Employment requiring either more skill or less has been less affected.
However the timing of the great technological change of the era – internet use by business starting in the late 1990s – does not match that of the growth of income inequality (starting in the early 1970s but slackening somewhat in the 1990s).
Nor does the introduction of technologies that increase the demand for more skilled workers seem to be generally associated with a divergence in household income among the population. Inventions of the 20th century such as AC electric power, the automobile, airplane, radio, television, the washing machine, Xerox machine, each had an economic impact similar to computers, microprocessors and internet, but did not coincide with greater inequality.
Another explanation is that the combination of the introduction of technologies that increase the demand for skilled workers, and the failure of the American education system to provide a sufficient increase in those skilled workers has bid up those workers' salaries.
An example of the slowdown in education growth in America (that began about the same time as the Great Divergence began) is the fact that the average person born in 1945 received two more years of schooling than his parents, while the average person born in 1975 received only half a year more of schooling.
Author Timothy Noah's "back-of-the-envelope" estimation, based on a "composite of my discussions with and reading of the various economists and political scientists", is that the "various failures" in America's education system are "responsible for 30%" of the post-1978 increase in inequality.
Race and Gender Disparities:
Further information: Gender pay gap in the United States and Racial wage gap in the United States
Income levels vary by gender and race, with median incomes for women and for certain racial demographics falling considerably below the national median.
Despite considerable progress in pursuing gender and racial equality, some social scientists, such as Richard Schaefer, attribute these discrepancies in income partly to continued discrimination.
Among women, part of the wage gap is due to employment choices and preferences. Women are more likely to consider factors other than salary when looking for employment. On average, women are less willing to travel or relocate, take more hours off and work fewer hours, and choose college majors that lead to lower paying jobs. Women are also more likely to work for governments or non-profits which pay less than the private sector.
According to this perspective, certain ethnic minorities and women receive fewer promotions and opportunities for occupational and economic advancement than others. In the case of women this concept is referred to as the glass ceiling, which keeps women from climbing the occupational ladder.
In terms of race, Asian Americans are far more likely to be in the highest earning 5 percent than the rest of Americans. Studies have shown that African Americans are less likely to be hired than White Americans with the same qualifications.
The continued prevalence of traditional gender roles and ethnic stereotypes may partially account for current levels of discrimination. In 2005, median income levels were highest among Asian and White males and lowest among females of all races, especially those identifying as African American or Hispanic.
Despite closing gender and racial gaps, considerable discrepancies remain among racial and gender demographics, even at the same level of educational attainment. The economic success of Asian Americans may come from how they devote much more time to education than their peers. Asian Americans have significantly higher college graduation rates than their peers and are much more likely to enter high status and high income occupations.
Since 1953 the income gap between male and female workers has decreased considerably but remains relatively large. Women currently earn significantly more Associate's, Bachelor's, and Master's degrees than men and almost as many Doctorates. Women are projected to have passed men in Doctorates earned in 2006–2007, and to earn nearly two thirds of Associate's, Bachelor's, and Master's degrees by 2016.
Income inequality between the sexes nonetheless remained stark at all levels of educational attainment. Between 1953 and 2005, median earnings as well as educational attainment increased at a far greater pace for women than for men. Median income for female earners increased 157.2% versus 36.2% for male earners, over four times as fast.
Today the median male worker earns roughly 68.4% more than his female counterpart, compared to 176.3% in 1953. The median income of men in 2005 was 2% higher than in 1973, compared to a 74.6% increase for female earners.
Racial differences remained stark as well, with the highest-earning race-gender demographic of workers aged 25 or older, Asian males (who were roughly tied with white males), earning slightly more than twice as much as the lowest-earning demographic, Hispanic females.
As mentioned above, inequality between races and genders persisted at similar education levels. Racial differences were overall more pronounced among male than among female income earners. Research indicates that in 2009, Hispanics were more than twice as likely as non-Hispanic whites to be poor.
Lower average English ability, low levels of educational attainment, part-time employment, the youthfulness of Hispanic household heads, and the 2007–09 recession are important factors that have pushed up the Hispanic poverty rate relative to non-Hispanic whites.
During the early 1990s, median earnings decreased for both sexes, not increasing substantially until the late 1990s. Since 1974 the median income for workers of both sexes has increased by 31.7%, from $18,474 to $24,325, reaching its high point in 2000.
Incentives:
In the context of concern over income inequality, a number of economists, such as Federal Reserve Chairman Ben Bernanke, have talked about the importance of incentives: "... without the possibility of unequal outcomes tied to differences in effort and skill, the economic incentive for productive behavior would be eliminated, and our market-based economy ... would function far less effectively."
Since abundant supply decreases market value, the possession of scarce skills considerably increases income. Among the American lower class, the most common source of income was not occupation, but government welfare.
Stock buybacks:
Writing in the Harvard Business Review in September 2014, William Lazonick blamed record corporate stock buybacks for reduced investment in the economy and a corresponding impact on prosperity and income inequality.
Between 2003 and 2012, the 449 companies in the S&P 500 used 54% of their earnings ($2.4 trillion) to buy back their own stock. An additional 37% was paid to stockholders as dividends. Together, these were 91% of profits. This left little for investment in productive capabilities or higher income for employees, shifting more income to capital rather than labor.
Lazonick blamed executive compensation arrangements, which are heavily based on stock options, stock awards and bonuses for meeting earnings per share (EPS) targets (EPS increases as the number of outstanding shares decreases). Restrictions on buybacks were greatly eased in the early 1980s. He advocates changing these incentives to limit buybacks.
U.S. companies are projected to increase buybacks to $701 billion in 2015 according to Goldman Sachs, an 18% increase over 2014. For scale, annual non-residential fixed investment (a proxy for business investment and a major GDP component) was estimated to be about $2.1 trillion for 2014.
Journalist Timothy Noah wrote in 2012 that: "My own preferred hypothesis is that stockholders appropriated what once belonged to middle-class wage earners." Since the vast majority of stocks are owned by higher income households, this contributes to income inequality.
Journalist Harold Meyerson wrote in 2014 that: "The purpose of the modern U.S. corporation is to reward large investors and top executives with income that once was spent on expansion, research, training and employees."
Tax and transfer policies:
Main article: Tax policy and economic inequality in the United States
Background:
U.S. income inequality is comparable to that of other developed nations pre-tax, but is among the worst after taxes and transfers. This indicates that U.S. tax policies redistribute income from higher-income to lower-income households relatively less than those of other developed countries.
Journalist Timothy Noah summarized the results of several studies in his 2012 book The Great Divergence.
Income taxes:
A key factor in income inequality/equality is the effective rate at which income is taxed, coupled with the progressivity of the tax system. A progressive tax is a tax in which the effective tax rate increases as the taxable base amount increases. Overall income tax rates in the U.S. are below the OECD average and until 2005 had been declining.
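The mechanics of a progressive tax can be made concrete with a small sketch. The brackets below are purely illustrative (they are not actual U.S. rates): each slice of income is taxed at its bracket's rate, so the effective rate rises as taxable income rises.

```python
# Hypothetical progressive schedule for illustration only: (upper bound, rate) pairs.
BRACKETS = [(10_000, 0.10), (40_000, 0.20), (float("inf"), 0.35)]

def tax_owed(income: float) -> float:
    """Tax each slice of income at the rate of the bracket it falls into."""
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        owed += (min(income, upper) - lower) * rate
        lower = upper
    return owed

def effective_rate(income: float) -> float:
    """Total tax divided by total income; rises with income under a progressive schedule."""
    return tax_owed(income) / income

# effective_rate(20_000) -> 0.15, effective_rate(100_000) -> 0.28
```

Under this toy schedule a $20,000 earner pays an effective 15% while a $100,000 earner pays an effective 28%, which is what "the effective tax rate increases as the taxable base amount increases" means in practice.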
How much tax policy change over the last thirty years has contributed to income inequality is disputed. In their comprehensive 2011 study of income inequality (Trends in the Distribution of Household Income Between 1979 and 2007), the CBO found that the top fifth of the population saw a 10-percentage-point increase in their share of after-tax income.
Most of that growth went to the top 1 percent of the population. All other groups saw their shares decline by 2 to 3 percentage points. In 2007, federal taxes and transfers reduced the dispersion of income by 20 percent, but that equalizing effect was larger in 1979. The share of transfer payments to the lowest-income households declined. The overall average federal tax rate fell.
However, a more recent CBO analysis indicates that with changes to 2013 tax law (e.g., the expiration of the 2001–2003 Bush tax cuts for top earners and the increased payroll taxes passed as part of the Affordable Care Act), the effective federal tax rates for the highest-earning households will increase to levels not seen since 1979.
According to journalist Timothy Noah, "you can't really demonstrate that U.S. tax policy had a large impact on the three-decade income inequality trend one way or the other. The inequality trend for pre-tax income during this period was much more dramatic." Noah estimates tax changes account for 5% of the Great Divergence.
But many – such as economist Paul Krugman – emphasize the effect of changes in taxation on increased income inequality, such as the 2001 and 2003 Bush administration tax cuts, which cut taxes far more for high-income households than for those below.
Part of the growth of income inequality under Republican administrations (described by Larry Bartels) has been attributed to tax policy. A study by Thomas Piketty and Emmanuel Saez found that "large reductions in tax progressivity since the 1960s took place primarily during two periods: the Reagan presidency in the 1980s and the Bush administration in the early 2000s."
During Republican President Ronald Reagan's tenure in office, the top marginal income tax rate was reduced from over 70 percent to 28 percent; high top marginal rates like 70% had been in place during much of the period of relative income equality following the "Great Compression".
The lowest marginal rate fell from 14 to 11 percent. However, the effective rate on top earners before Reagan's tax cut was much lower because of loopholes and charitable contributions.
Taxes on capital:
Taxes on income derived from capital (e.g., financial assets, property and businesses) primarily affect higher income groups, who own the vast majority of capital. For example, in 2010 approximately 81% of stocks were owned by the top 10% income group and 69% by the top 5%.
Only about one-third of American households have stock holdings worth more than $7,000. Therefore, since higher-income taxpayers have a much higher share of their income represented by capital gains, lowering taxes on capital income and gains increases after-tax income inequality.
Capital gains taxes were reduced around the time income inequality began to rise again around 1980 and several times thereafter. During 1978 under President Carter, the top capital gains tax rate was reduced from 49% to 28%. President Ronald Reagan's 1981 cut in the top rate on unearned income reduced the maximum capital gains rate to only 20% – its lowest level since the Hoover administration, as part of an overall economic growth strategy.
The capital gains tax rate was also reduced by President Bill Clinton in 1997, from 28% to 20%. President George W. Bush reduced the tax rate on capital gains and qualifying dividends from 20% to 15%, less than half the 35% top rate on ordinary income.
The CBO reported in August 1990 that: "Of the eight studies reviewed, five, including the two CBO studies, found that cutting taxes on capital gains is not likely to increase savings, investment, or GNP much if at all." Some of the studies indicated the loss in revenue from lowering the tax rate may be offset by higher economic growth; others did not.
Journalist Timothy Noah wrote in 2012 that: "Every one of these changes elevated the financial interests of business owners and stockholders above the well-being, financial or otherwise, of ordinary citizens." So overall, while cutting capital gains taxes adversely affects income inequality, its economic benefits are debatable.
Other tax policies:
Rising inequality had also been attributed to President Bush's veto of tax harmonization, as this would have prohibited offshore tax havens.
Debate over effects of tax policies:
One study found that reductions in total effective tax rates were most significant for individuals with the highest incomes (see "Federal Tax Rate by Income Group" chart). For those with incomes in the top 0.01 percent, the overall federal tax rate fell from 74.6% in 1970 to 34.7% in 2004 (the trend reversed briefly in 2000, rising to 40.8% after the 1993 Clinton deficit-reduction tax bill). The rate for the next 0.09 percent fell from 59.1% to 34.1%, while the 99.5–99.9 percent group saw a relatively modest drop, from 41.4% to 33.0%.
Although the tax rate for low-income earners fell as well (though not as much), these tax reductions compare with virtually no change – 23.3% tax rate in 1970, 23.4% in 2004 – for the US population overall.
The study found that the decline in progressivity since 1960 was due to the shift from the allocation of corporate income taxes among labor and capital to the effects of the individual income tax. Paul Krugman also supports this claim, saying, "The overall tax rate on these high income families fell from 36.5% in 1980 to 26.7% in 1989."
From the White House's own analysis, the federal tax burden for those making greater than $250,000 fell considerably during the late 1980s, 1990s and 2000s, from an effective tax of 35% in 1980, down to under 30% from the late 1980s to 2011.
Many studies argue that tax-law changes affecting S corporations confound the statistics prior to 1990. However, even after these changes, inflation-adjusted average after-tax income grew by 25% between 1996 and 2006 (the last year for which individual income tax data was publicly available).
This average increase, however, obscures a great deal of variation. The poorest 20% of tax filers experienced a 6% reduction in income while the top 0.1 percent of tax filers saw their income almost double. Tax filers in the middle of the income distribution experienced about a 10% increase in income. Also during this period, the proportion of income from capital increased for the top 0.1 percent from 64% to 70%.
Transfer payments:
Transfer payments refer to payments to persons such as social security, unemployment compensation, or welfare. CBO reported in November 2014 that: "Government transfers reduce income inequality because the transfers received by lower-income households are larger relative to their market income than are the transfers received by higher-income households.
Federal taxes also reduce income inequality, because the taxes paid by higher-income households are larger relative to their before-tax income than are the taxes paid by lower-income households. The equalizing effects of government transfers were significantly larger than the equalizing effects of federal taxes from 1979 to 2011."
CBO also reported that less progressive tax and transfer policies have contributed to greater after-tax income inequality: "As a result of the diminishing effect of transfers and federal taxes, the Gini index for income after transfers and federal taxes grew by more than the index for market income.
Between 1979 and 2007, the Gini index for market income increased by 23 percent, the index for market income after transfers increased by 29 percent, and the index for income measured after transfers and federal taxes increased by 33 percent."
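The Gini index cited in these CBO figures can be computed directly from a list of incomes. Below is a minimal sketch using the standard mean-absolute-difference formula, G = Σᵢ Σⱼ |xᵢ − xⱼ| / (2n²·mean); the incomes are toy numbers for illustration, not CBO data.

```python
def gini(incomes):
    """Gini coefficient via the mean-absolute-difference formula:
    G = sum_i sum_j |x_i - x_j| / (2 * n^2 * mean). 0 = perfect equality."""
    n = len(incomes)
    mean = sum(incomes) / n
    total = sum(abs(a - b) for a in incomes for b in incomes)
    return total / (2 * n * n * mean)

# Toy example: transfers and taxes compress the distribution, lowering the index.
market_income = [10, 20, 30, 40, 100]   # hypothetical market incomes
after_t_and_t = [15, 23, 31, 39, 85]    # same households after transfers and taxes
# gini(market_income) -> 0.40; gini(after_t_and_t) -> ~0.32
```

This is the sense in which transfers and federal taxes "reduce the dispersion of income": the post-tax-and-transfer Gini is lower than the market-income Gini, and the CBO passages above track how much that gap has narrowed over time.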
Tax expenditures:
Tax expenditures (i.e., exclusions, deductions, preferential tax rates, and tax credits) cause revenues to be much lower than they would otherwise be for any given tax rate structure.
The benefits from tax expenditures, such as income exclusions for healthcare insurance premiums paid by employers and tax deductions for mortgage interest, are distributed unevenly across the income spectrum. They are often what Congress offers to special interests in exchange for their support. According to a report from the CBO that analyzed the 2013 data:
"Understanding how each tax expenditure is distributed across the income spectrum can inform policy choices."
Other Causes:
Shifts in political power:
Paul Krugman wrote in 2015 that: "Economists struggling to make sense of economic polarization are, increasingly, talking not about technology but about power."
This market power hypothesis basically asserts that market power has concentrated in monopolies and oligopolies that enable unusual amounts of income ("rents") to be transferred from the many consumers to relatively few owners. This hypothesis is consistent with higher corporate profits without a commensurate rise in investment, as firms facing less competition choose to pass a greater share of their profits to shareholders (such as through share buybacks and dividends) rather than re-invest in the business to ward off competitors.
One cause of this concentration of market power was the rightward shift in American politics toward more conservative policies since 1980, as politics plays a big role in how market power can be exercised.
Policies that removed barriers to monopoly and oligopoly included anti-union laws, reduced anti-trust activity, deregulation (or failure to regulate) non-depository banking, contract laws that favored creditors over debtors, etc. Further, rising wealth concentration can be used to purchase political influence, creating a feedback loop.
Decline of unions:
Further information: Labor unions in the United States
The era of inequality growth has coincided with a dramatic decline in labor union membership from 20% of the labor force in 1983 to about 12% in 2007. Classical and neoclassical economists have traditionally thought that since the chief purpose of a union is to maximize the income of its members, a strong but not all-encompassing union movement would lead to increased income inequality.
However, given the increase in income inequality of the past few decades, either the sign of the effect must be reversed, or the magnitude of the effect must be small and a much larger opposing force has overridden it.
However, more recently, research has shown that unions' ability to reduce income disparities among members outweighed other factors, and their net effect has been to reduce national income inequality.
The decline of unions has hurt this leveling effect among men, with one economist, Berkeley's David Card, estimating that about 15–20% of the "Great Divergence" among men is the result of declining unionization.
According to scholars, "As organized labor's political power dissipates, economic interests in the labor market are dispersed and policy makers have fewer incentives to strengthen unions or otherwise equalize economic rewards."
Unions were a balancing force, helping ensure wages kept up with productivity and that neither executives nor shareholders were unduly rewarded. Further, societal norms placed constraints on executive pay.
This changed as union power declined (the share of unionized workers fell significantly during the Great Divergence, from over 30% to around 12%) and CEO pay skyrocketed (rising from around 40 times the average workers pay in the 1970s to over 350 times in the early 2000s).
A 2015 report by the International Monetary Fund also attributes the decline of labor's share of GDP to de-unionization, noting the trend "necessarily increases the income share of corporate managers' pay and shareholder returns ... Moreover, weaker unions can reduce workers' influence on corporate decisions that benefit top earners, such as the size and structure of top executive compensation."
Still other researchers think that the labor movement's loss of national political power to promote equalizing "government intervention and changes in private sector behavior" has had the greatest impact on inequality in the US. Sociologist Jake Rosenfeld of the University of Washington argues that labor unions were the primary institution fighting inequality in the United States and helped grow a multiethnic middle class, and that their decline has resulted in diminishing prospects for U.S. workers and their families.
Timothy Noah estimates the decline of labor union power to be "responsible for 20%" of the Great Divergence. While the decline of union power in the US has been a factor in declining middle-class incomes, unions have retained their clout in Western Europe.
In Denmark, influential trade unions such as Fagligt Fælles Forbund (3F) ensure that fast-food workers earn a living wage, the equivalent of $20 an hour, which is more than double the hourly rate for their counterparts in the United States.
Critics of technological change as an explanation for the "Great Divergence" of income levels in America point to public policy and party politics, or "stuff the government did, or didn't do". They argue these have led to a trend of declining labor union membership rates and resulting diminishing political clout, decreased expenditure on social services, and less government redistribution. Moreover, the United States is the only advanced economy without a labor-based political party.
As of 2011, several state legislatures have launched initiatives aimed at lowering wages, labor standards, and workplace protections for both union and non-union workers.
The economist Joseph Stiglitz argues that "Strong unions have helped to reduce inequality, whereas weaker unions have made it easier for CEOs, sometimes working with market forces that they have helped shape, to increase it." The long fall in unionization in the U.S. since WWII has seen a corresponding rise in the inequality of wealth and income.
Political parties and presidents:
Liberal political scientist Larry Bartels has found a strong correlation between the party of the president and income inequality in America since 1948 (see below), examining average annual pre-tax income growth from 1948 to 2005, a period that encompassed most of the egalitarian Great Compression and the entire inegalitarian Great Divergence.
Bartels shows that under Democratic presidents (from Harry Truman forward), the greatest income gains have been at the bottom of the income scale, tapering off as income rose.
Under Republican presidents, in contrast, gains were much smaller, and what growth there was concentrated toward the top, tapering off as you went down the income scale.
Summarizing Bartels's findings, journalist Timothy Noah referred to the administrations of Democratic presidents as "Democrat-world", and GOP administrations as "Republican-world":
In Democrat-world, pre-tax income increased 2.64% annually for the poor and lower-middle-class and 2.12% annually for the upper-middle-class and rich. There was no Great Divergence. Instead, the Great Compression – the egalitarian income trend that prevailed through the 1940s, 1950s, and 1960s – continued to the present, albeit with incomes converging less rapidly than before.
In Republican-world, meanwhile, pre-tax income increased 0.43 percent annually for the poor and lower-middle-class and 1.90 percent for the upper-middle-class and rich. Not only did the Great Divergence occur; it was more greatly divergent. Also of note: In Democrat-world pre-tax income increased faster than in the real world not just for the 20th percentile but also for the 40th, 60th, and 80th.
We were all richer and more equal! But in Republican-world, pre-tax income increased slower than in the real world not just for the 20th percentile but also for the 40th, 60th, and 80th. We were all poorer and less equal! Democrats also produced marginally faster income growth than Republicans at the 95th percentile, but the difference wasn't statistically significant.
The pattern of distribution of growth appears to be the result of a whole host of policies,
including not only the distribution of taxes and benefits but also the government's stance toward unions, whether the minimum wage rises, the extent to which the government frets about inflation versus too-high interest rates, etc., etc.
Noah admits the evidence of this correlation is "circumstantial rather than direct", but so is "the evidence that smoking is a leading cause of lung cancer."
In his 2017 book The Great Leveler, historian Walter Scheidel points out that, starting in the 1970s, both parties shifted toward promoting free-market capitalism, with Republicans moving further to the political right than Democrats to the political left. He notes that Democrats were instrumental in the financial deregulation of the 1990s and have largely neglected social welfare issues while increasingly focusing on issues pertaining to identity politics.
The Clinton Administration in particular continued promoting free market, or neoliberal, reforms which began under the Reagan Administration.
Non-party political action:
Further information: Executive pay in the United States
According to political scientists Jacob Hacker and Paul Pierson writing in the book Winner-Take-All Politics, the important policy shifts were brought on not by the Republican Party but by the development of a modern, efficient political system, especially lobbying, by top earners – and particularly corporate executives and the financial services industry.
The end of the 1970s saw a transformation of American politics away from a focus on the middle class, with new, much more effective, aggressive and well-financed lobbyists and pressure groups acting on behalf of upper income groups. Executives successfully eliminated any countervailing power or oversight of corporate managers (from private litigation, boards of directors and shareholders, the Securities and Exchange Commission or labor unions).
The financial industry's success came from successfully pushing for deregulation of financial markets, allowing much more lucrative but much riskier investments, from which it privatized the gains while socializing the losses with government bailouts (the two groups formed about 60% of the top 0.1 percent of earners). All top earners were helped by deep cuts in estate and capital gains taxes and in tax rates on high levels of income.
Arguing against the proposition that the explosion in pay for corporate executives – which grew from 35X average worker pay in 1978 to over 250X average pay before the 2007 recession – is driven by an increased demand for scarce talent and set according to performance, Krugman points out that multiple factors outside of executives' control govern corporate profitability, particularly in the short term, when the head of a company like Enron may look like a great success.
Further, corporate boards follow other companies in setting pay even if the directors themselves disagree with lavish pay "partly to attract executives whom they consider adequate, partly because the financial market will be suspicious of a company whose CEO isn't lavishly paid." Finally "corporate boards, largely selected by the CEO, hire compensation experts, almost always chosen by the CEO" who naturally want to please their employers.
Lucian Arye Bebchuk and Jesse M. Fried, the authors of Pay Without Performance, a critique of executive pay, argue that executive capture of corporate governance is so complete that only public relations, i.e. public "outrage", constrains their pay. This in turn has been reduced as traditional critics of excessive pay – such as politicians (for whom the need for campaign contributions from the richest outweighs populist indignation), media (lauding business genius), and unions (crushed) – are now silent.
In addition to politics, Krugman postulated that changes in the norms of corporate culture have played a factor. Although in the 1950s and 60s corporate executives had (or could have developed) the ability to pay themselves very high compensation through control of corporate boards of directors, they restrained themselves.
But from 1982 to 2002, the average real annual compensation of the top 100 C.E.O.s skyrocketed from $1.3 million (39 times the pay of an average worker) to $37.5 million (more than 1,000 times the pay of ordinary workers).
Journalist George Packer also sees the dramatic increase in inequality in America as a product of the change in attitude of the American elite, which (in his view) has been transitioning itself from pillars of society to a special interest group.
Author Timothy Noah estimates that what he calls "Wall Street and corporate boards' pampering" of the highest earning 0.1% is "responsible for 30%" of the post-1978 increase in inequality.
Immigration:
The Immigration and Nationality Act of 1965 increased immigration to America, especially of non-Europeans. From 1970 to 2007, the foreign-born proportion of America's population grew from 5% to 11%, most of whom had lower education levels and incomes than native-born Americans.
But the contribution of this increase in the supply of low-skill labor seems to have been relatively modest. One estimate stated that immigration reduced the average annual income of native-born "high-school dropouts" ("who roughly correspond to the poorest tenth of the workforce") by 7.4% from 1980 to 2000.
The decline in income of better educated workers was much less. Author Timothy Noah estimates that "immigration" is responsible for just 5% of the "Great Divergence" in income distribution, as does economist David Card.
While immigration was found to have slightly depressed the wages of the least skilled and least educated American workers, it doesn't explain rising inequality among high school and college graduates.
Scholars such as political scientists Jacob S. Hacker, Paul Pierson, Larry Bartels and Nathan Kelly, and economist Timothy Smeeding, question the explanation of educational attainment and workplace skills, pointing out that other countries with similar education levels and economies have not gone the way of the US, and that the concentration of income in the US hasn't followed a pattern of "the 29% of Americans with college degrees pulling away" from those who have less education.
Wage theft:
A September 2014 report by the Economic Policy Institute claims wage theft is also responsible for exacerbating income inequality: "Survey evidence suggests that wage theft is widespread and costs workers billions of dollars a year, a transfer from low-income employees to business owners that worsens income inequality, hurts workers and their families, and damages the sense of fairness and justice that a democracy needs to survive."
Corporatism:
See also: Corporatocracy
Edmund Phelps published an analysis in 2010 theorizing that the cause of income inequality is not free market capitalism, but instead is the result of the rise of corporatism.
Corporatism, in his view, is the antithesis of free market capitalism. It is characterized by semi-monopolistic organizations and banks, big employer confederations, often acting with complicit state institutions in ways that discourage (or block) the natural workings of a free economy.
The primary effects of corporatism are the consolidation of economic power and wealth with end results being the attrition of entrepreneurial and free market dynamism.
His follow-up book, Mass Flourishing, further defines corporatism by the following attributes:
Today, in the United States, virtually all of these economic conditions are being borne out. With regard to income inequality, the 2014 income analysis of University of California, Berkeley economist Emmanuel Saez confirms that relative growth of income and wealth is not occurring among small and mid-sized entrepreneurs and business owners (who generally populate the lower half of the top one-percenters in income), but instead only among the top 0.1 percent of the income distribution ... whom Paul Krugman describes as "super-elites - corporate bigwigs and financial wheeler-dealers" ... who earn $2,000,000 or more every year.
For example, measured relative to GDP, total compensation and its component wages and salaries have been declining since 1970. This indicates a shift in income from labor (persons who derive income from hourly wages and salaries) to capital (persons who derive income via ownership of businesses, land and assets).
Wages and salaries have fallen from approximately 51% GDP in 1970 to 43% GDP in 2013. Total compensation has fallen from approximately 58% GDP in 1970 to 53% GDP in 2013.
To put this in perspective, five percent of U.S. GDP was approximately $850 billion in 2013. This represents an additional $7,000 in wages and salaries for each of the 120 million U.S. households. Larry Summers estimated in 2007 that the lower 80% of families were receiving $664 billion less income than they would be with a 1979 income distribution (a period of much greater equality), or approximately $7,000 per family.
Not receiving this income may have led many families to increase their debt burden, a significant factor in the 2007-2009 subprime mortgage crisis, as highly leveraged homeowners suffered a much larger reduction in their net worth during the crisis. Further, since lower income families tend to spend relatively more of their income than higher income families, shifting more of the income to wealthier families may slow economic growth.
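The per-household arithmetic above can be reproduced with a quick back-of-the-envelope calculation. This is only a sketch: the ~$17 trillion GDP figure is an assumption implied by the text's "five percent of GDP was approximately $850 billion", not a number the source states directly.

```python
# Rough check of the per-household figures cited above. Assumed inputs:
# ~$17 trillion 2013 GDP (implied by "5% of GDP ~ $850 billion"),
# 120 million households, a ~5-point shift in the wage share of GDP.
gdp_2013 = 17.0e12          # approximate 2013 U.S. GDP in dollars (assumption)
households = 120e6          # approximate number of U.S. households
wage_share_shift = 0.05     # wages fell ~5 percentage points of GDP

shifted_income = gdp_2013 * wage_share_shift   # ~ $850 billion
per_household = shifted_income / households    # ~ $7,000 per household

print(f"${shifted_income / 1e9:.0f} billion total, ${per_household:,.0f} per household")
```

The ~$7,000-per-household result matches Summers's independent 1979-distribution estimate quoted above, which is why the text treats the two figures as consistent.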
In another example, The Economist propounds that a swelling corporate financial and banking sector has caused Gini Coefficients to rise in the U.S. since 1980: "Financial services' share of GDP in America doubled to 8% between 1980 and 2000; over the same period their profits rose from about 10% to 35% of total corporate profits, before collapsing in 2007–09. Bankers are being paid more, too. In America the compensation of workers in financial services was similar to average compensation until 1980. Now it is twice that average."
The summary argument, considering these findings, is that if corporatism is the consolidation and sharing of economic and political power between large corporations and the state ... then a corresponding concentration of income and wealth (with resulting income inequality) is an expected by-product of such a consolidation.
Neoliberalism:
See also: Market fundamentalism
Some economists, sociologists and anthropologists argue that neoliberalism, or the resurgence of 19th century theories relating to laissez-faire economic liberalism in the late 1970s, has been the significant driver of inequality.
More broadly, according to The Handbook of Neoliberalism, the term has "become a means of identifying a seemingly ubiquitous set of market-oriented policies as being largely responsible for a wide range of social, political, ecological and economic problems."
Vicenç Navarro points to policies pertaining to the deregulation of labor markets, privatization of public institutions, union busting and reduction of public social expenditures as contributors to this widening disparity.
The privatization of public functions, for example, grows income inequality by depressing wages and eliminating benefits for middle class workers while increasing income for those at the top. The deregulation of the labor market undermined unions by allowing the real value of the minimum wage to plummet, resulting in employment insecurity and widening wage and income inequality.
David M. Kotz, professor of economics at the University of Massachusetts Amherst, contends that neoliberalism "is based on the thorough domination of labor by capital." As such, the advent of the neoliberal era has seen a sharp increase in income inequality through the decline of unionization, stagnant wages for workers and the rise of CEO supersalaries.
According to Emmanuel Saez: The labor market has been creating much more inequality over the last thirty years, with the very top earners capturing a large fraction of macroeconomic productivity gains.
A number of factors may help explain this increase in inequality, not only underlying technological changes but also the retreat of institutions developed during the New Deal and World War II - such as progressive tax policies, powerful unions, corporate provision of health and retirement benefits, and changing social norms regarding pay inequality.
Pennsylvania State University political science professor Pamela Blackmon attributes the trends of growing poverty and income inequality to the convergence of several neoliberal policies during Ronald Reagan's presidency, including the decreased funding of education, decreases in the top marginal tax rates, and shifts in transfer programs for those in poverty.
Journalist Mark Bittman echoes this sentiment in a 2014 piece for The New York Times:
"The progress of the last 40 years has been mostly cultural, culminating, the last couple of years, in the broad legalization of same-sex marriage. But by many other measures, especially economic, things have gotten worse, thanks to the establishment of neo-liberal principles — anti-unionism, deregulation, market fundamentalism and intensified, unconscionable greed — that began with Richard Nixon and picked up steam under Ronald Reagan. Too many are suffering now because too few were fighting then."
Fred L. Block and Margaret Somers, in expanding on Karl Polanyi's critique of laissez-faire theories in The Great Transformation, argue that Polanyi's analysis helps to explain why the revival of such ideas has contributed to the "persistent unemployment, widening inequality, and the severe financial crises that have stressed Western economies over the past forty years."
John Schmitt and Ben Zipperer of the Center for Economic and Policy Research also point to economic liberalism as one of the causes of income inequality. They note that European nations, in particular the social democracies of Northern Europe with extensive and well funded welfare states, have lower levels of income inequality and social exclusion than the United States.
Overview:
Income inequality in the United States (previous topic) has grown significantly since the early 1970s, after several decades of stability, and has been the subject of study of many scholars and institutions. The U.S. consistently exhibits higher rates of income inequality than most developed nations, arguably due to the nation's relatively enhanced support of free market capitalism.
According to the CBO and others, "the precise reasons for the [recent] rapid growth in income at the top are not well understood", but "in all likelihood," an "interaction of multiple factors" was involved. "Researchers have offered several potential rationales." Some of these rationales conflict, some overlap. They include:
- the globalization hypothesis – low skilled American workers have been losing ground in the face of competition from low-wage workers in Asia and other "emerging" economies;
- skill-biased technological change – the rapid pace of progress in information technology has increased the demand for the highly skilled and educated so that income distribution favored brains rather than brawn;
- the superstar hypothesis – modern technologies of communication often turn competition into a tournament in which the winner is richly rewarded, while the runners-up get far less than in the past;
- immigration of less-educated workers – relatively high levels of immigration of low skilled workers since 1965 may have reduced wages for American-born high school dropouts;
- changing institutions and norms – Unions were a balancing force, helping ensure wages kept up with productivity and that neither executives nor shareholders were unduly rewarded. Further, societal norms placed constraints on executive pay. This changed as union power declined (the share of unionized workers fell significantly during the Great Divergence, from over 30% to around 12%) and CEO pay skyrocketed (rising from around 40 times the average workers pay in the 1970s to over 350 times in the early 2000s).
- policy, politics and race – movement conservatives increased their influence over the Republican Party beginning in the 1970s, moving it politically rightward. Combined with the Party's expanded political power (enabled by a shift of southern white Democrats to the Republican Party following the passage of Civil Rights legislation in the 1960s), this resulted in more regressive tax laws, anti-labor policies, and further limited expansion of the welfare state relative to other developed nations (e.g., the unique absence of universal healthcare).
Paul Krugman put several of these factors into context in January 2015: "Competition from emerging-economy exports has surely been a factor depressing wages in wealthier nations, although probably not the dominant force. More important, soaring incomes at the top were achieved, in large part, by squeezing those below: by cutting wages, slashing benefits, crushing unions, and diverting a rising share of national resources to financial wheeling and dealing ... Perhaps more important still, the wealthy exert a vastly disproportionate effect on policy. And elite priorities — obsessive concern with budget deficits, with the supposed need to slash social programs — have done a lot to deepen [wage stagnation and income inequality]."
Divergence of productivity and compensation:
Overall:
One view of economic equity is that employee compensation should rise with productivity (defined as real output per hour of labor worked). In other words, if the employee produces more, they should be paid accordingly. If pay lags behind productivity, income inequality grows, as labor's share of the output is falling, while capital's share (generally higher-income owners) is rising.
According to a June 2017 report from the non-partisan Bureau of Labor Statistics (BLS), productivity rose in tandem with employee compensation (a measure which includes wages as well as benefits such as health insurance) from the 1940s through the 1970s. However, since then productivity has grown faster than compensation.
BLS refers to this as the "productivity-compensation gap", an issue which has garnered much attention from academics and policymakers. BLS reported this gap occurs across most industries: "When examined at a detailed industry level, the average annual percent change in productivity outpaced compensation in 83 percent of 183 industries studied" measured from 1987-2015.
For example, in the information industry, productivity increased at an annual average rate of 5.0% over the 1987-2015 period, while compensation increased at about a 1.5% rate, resulting in a 3.5% productivity gap. In Manufacturing, the gap was 2.7%; in Retail Trade 2.6%; and in Transportation and Warehousing 1.3%. This analysis adjusted for inflation using the Consumer Price Index or CPI, a measure of inflation based on what is consumed, rather than what is produced.
Analyzing the gap:
BLS explained the gap between productivity and compensation can be divided into two components, the effect of which varies by industry: 1) Recalculating the gap using an industry-specific inflation adjustment ("industry deflator") rather than consumption (CPI); and 2) The change in labor's share of income, defined as how much of a business' revenue goes to workers as opposed to intermediate purchases (i.e., cost of goods) and capital (owners) in that industry.
The difference in deflators was the stronger effect among high-productivity-growth industries, while the change in labor's share of income was the stronger effect among most other industries. For example, the 3.5% productivity gap in the information industry was composed of a 2.1-point difference in deflators and about 1.4 points due to the change in labor share.
The 2.7% gap in Manufacturing included 1.0% due to deflation and 1.7% due to change in labor share.
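The decomposition described above can be sketched numerically, using only the average annual percentage-point figures quoted in the text:

```python
# Sketch of the BLS decomposition described above: the productivity-
# compensation gap is productivity growth minus compensation growth,
# and it splits into a deflator effect plus a labor-share effect.
def gap(productivity_growth, compensation_growth):
    """Average annual gap in percentage points."""
    return productivity_growth - compensation_growth

# Information industry, 1987-2015 (figures from the text)
info_gap = gap(5.0, 1.5)            # 3.5 points
info_deflator_effect = 2.1
info_labor_share_effect = info_gap - info_deflator_effect   # ~1.4 points

# Manufacturing: a 2.7-point gap = 1.0 (deflator) + labor-share remainder
mfg_labor_share_effect = 2.7 - 1.0                          # ~1.7 points

print(round(info_gap, 1), round(info_labor_share_effect, 1),
      round(mfg_labor_share_effect, 1))
```

The point of the decomposition is that only the labor-share component represents income moving from workers to owners; the deflator component reflects producing different goods than workers consume.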
Reasons for the gap:
BLS explained the decline in labor share as likely driven by three factors that vary by industry:
- Globalization: Income that might have gone to domestic workers is going to foreign workers due to offshoring (i.e., production and service activities in other countries).
- Increased automation: More automation means more share of income attributed to capital.
- Faster capital depreciation: Information assets depreciate more rapidly than machinery; the latter were the greater share of the capital base in the past. This may require a higher capital share to generate income than in the past.
Market Factors:
Globalization:
Globalization refers to the integration of economies in terms of trade, information, and jobs. Innovations in supply chain management enabled goods to be sourced in Asia and shipped to the United States less expensively than in the past. This integration of economies, particularly with the U.S. and Asia, had dramatic impacts on income inequality globally.
Economist Branko Milanovic analyzed global income inequality, comparing 1988 and 2008. His analysis indicated that the global top 1% and the middle classes of the emerging economies (e.g., China, India, Indonesia, Brazil and Egypt) were the main winners of globalization during that time.
The real (inflation adjusted) income of the global top 1% increased approximately 60%, while the middle classes of the emerging economies (those around the 50th percentile of the global income distribution in 1988) rose 70–80%.
For example, in 2000, 5 million Chinese households earned between $11,500 and $43,000 in 2016 dollars. By 2015, 225 million did. On the other hand, those in the middle class of the developed world (those in the 75th to 90th percentile in 1988, such as the American middle class) experienced little real income gains.
The richest 1% contains 60 million persons globally, including 30 million Americans (i.e., the top 12% of Americans by income were in the global top 1% in 2008), the most out of any country.
While economists who have studied globalization agree imports have had an effect, the timing of import growth does not match the growth of income inequality. By 1995 imports of manufactured goods from low-wage countries totaled less than 3% of US gross domestic product.
It was not until 2006 that the US imported more manufactured goods from low-wage (developing) countries than from high-wage (advanced) economies. Inequality increased during the 2000–2010 decade not because of stagnating wages for less-skilled workers, but because of accelerating incomes of the top 0.1%. Author Timothy Noah estimates that "trade" (increases in imports) is responsible for just 10% of the "Great Divergence" in income distribution.
Journalist James Surowiecki notes that in the last 50 years, companies and the sectors of the economy providing the most employment in the US – major retailers, restaurant chains, and supermarkets – are ones with lower profit margins and less pricing power than in the 1960s; while sectors with high profit margins and average salaries – like high technology – have relatively few employees.
Some economists claim that it is WTO-led globalization and competition from developing countries, especially China, that has resulted in the recent decline in labor's share of income and increased unemployment in the U.S. And the Economic Policy Institute and the Center for Economic and Policy Research argue that some trade agreements such as the Trans-Pacific Partnership could result in further job losses and declining wages.
One argument contrary to the globalization/technology hypothesis relates to variation across countries. Japan, Sweden and France did not experience significant increases in income inequality during the 1979–2010 period, although the U.S. did.
The top 1% income group continued to receive less than 10% of the income share in these countries, while the U.S. share rose from 10% to over 20%. Economist Emmanuel Saez wrote in 2014: "Differences across countries rule out technical change/globalization as the sole explanation ... Policies play a key role in shaping inequality (tax and transfer policies, regulations, education)."
Superstar hypothesis:
Eric Posner and Glen Weyl point out that inequality can be predominantly explained by the superstar hypothesis. In their opinion Piketty fails to observe the accelerated turnover that is occurring in the Forbes 400; only 35 people from the original 1982 list remain today. Many have fallen off as a result of heavy spending, large-scale philanthropy, and bad investments.
The current Forbes 400 is now primarily made up of newly wealthy business owners, not heirs and heiresses. In parallel research, the University of Chicago's Steven Kaplan and Stanford University's Joshua Rauh note that 69% of those on the Forbes list are actually first generation wealth creators. That figure has risen dramatically since 1982 when it stood at 40%.
Ed Dolan supports the globalization and superstar hypothesis but points out that the high earnings are based, to some extent, on moral hazard like "Bonus-based compensation schemes with inadequate clawback for losses" and the shift of losses to shareholders, unsecured creditors, or taxpayers.
Paul Krugman argues that for the US the surge in inequality to date has been mainly due to supersalaries, but capital has nonetheless been significant too. And when the current generation of the 1% turns its wealth over to heirs, those heirs become rentiers, people who live off accumulated capital. Two decades from now America could turn into a rentier-dominated society even more unequal than Belle Époque Europe.
One study extended the superstar hypothesis to corporations, with firms that are more dominant in their industry (in some cases due to oligopoly or monopoly) paying their workers far more than the average in the industry. Another study noted that "superstar firms" is another explanation for the decline in the overall share of income (GDP) going to workers/labor as opposed to owners/capital.
Education:
Main article: Educational attainment in the United States
Income differences between the varying levels of educational attainment (usually measured by the highest degree of education an individual has completed) have increased. Expertise and skill certified through an academic degree translates into increased scarcity of an individual's occupational qualification which in turn leads to greater economic rewards.
As the United States has developed into a post-industrial society more and more employers require expertise that they did not a generation ago, while the manufacturing sector which employed many of those lacking a post-secondary education is decreasing in size.
In the resulting economic job market the income discrepancy between the working class and the professional with the higher academic degrees, who possess scarce amounts of certified expertise, may be growing.
Households in the upper quintiles generally contain more, and better-educated, income earners in full-time employment than those in lower quintiles. Among households in the top quintile, 62% of householders were college graduates, 80% worked full-time, and 76% of households had two or more income earners, compared to national figures of 27%, 58% and 42%, respectively. US Census Bureau data indicate that occupational achievement and the possession of scarce skills correlate with higher income.
Average earnings in 2002 for the population 18 years and over were higher at each progressively higher level of education ... This relationship holds true not only for the entire population but also across most subgroups. Within each specific educational level, earnings differed by sex and race. This variation may result from a variety of factors, such as occupation, working full- or part-time, age, or labor force experience.
The "college premium" refers to the increase in income to workers with four-year college degrees relative to those without. The college premium doubled from 1980 to 2005, as the demand for college-educated workers has exceeded the supply.
Economists Goldin and Katz estimate that the increase in economic returns to education was responsible for about 60% of the increase in wage inequality between 1973 and 2005. The supply of available graduates did not keep up with business demand due primarily to increasingly expensive college educations.
Annual tuition at public and private universities averaged 4% and 20% respectively of the annual median family income from the 1950s to 1970s; by 2005 these figures were 10% and 45% as colleges raised prices in response to demand.
Economist David Autor wrote in 2014 that approximately two-thirds of the rise in income inequality between 1980 and 2005 was accounted for by the increased premium associated with education in general and post-secondary education in particular.
Two researchers have suggested that children in low income families are exposed to 636 words an hour, as opposed to 2,153 words in high income families during the first four formative years of a child's development. This, in turn, led to low achievement in later schooling due to the inability of the low income group to verbalize concepts.
A psychologist has stated that society stigmatizes poverty. Conversely, poor people tend to believe that the wealthy have been lucky or have earned their money through illegal means. She believes that both attitudes need to be discarded if the nation is to make headway in addressing the issue of inequality. She suggests that college not be a litmus test of success; that valorizing of one profession as more important than another is a problem.
Skill-biased technological change:
As of the mid-to-late 2000s, the most common explanation for income inequality in America was "skill-biased technological change" (SBTC) – "a shift in the production technology that favors skilled over unskilled labor by increasing its relative productivity and, therefore, its relative demand".
For example, one scholarly colloquium on the subject that included many prominent labor economists estimated that technological change was responsible for over 40% of the increase in inequality.
Other factors like international trade, decline in real minimum wage, decline in unionization and rising immigration, were each responsible for 10–15% of the increase.
Education has a notable influence on income distribution. In 2005, roughly 55% of income earners with doctorate degrees – the most educated 1.4% – were among the top 15 percent earners. Among those with Master's degrees – the most educated 10% – roughly half had incomes among the top 20 percent of earners. Only among households in the top quintile were householders with college degrees in the majority.
But while the higher education commonly translates into higher income, and the highly educated are disproportionately represented in upper quintile households, differences in educational attainment fail to explain income discrepancies between the top 1 percent and the rest of the population.
Large percentages of individuals lacking a college degree are present in all income demographics, including 33% of those heading households with six-figure incomes. From 2000 to 2010, the 1.5% of Americans with an M.D., J.D., or M.B.A. and the 1.5% with a PhD saw median income gains of approximately 5%.
Among those with a college or master's degree (about 25% of the American workforce) average wages dropped by about 7%, (though this was less than the decline in wages for those who had not completed college). Post-2000 data has provided "little evidence" for SBTC's role in increasing inequality. The wage premium for college educated has risen little and there has been little shift in shares of employment to more highly skilled occupations.
Approaching the issue from occupations that have been replaced or downgraded since the late 1970s, one scholar found that jobs that "require some thinking but not a lot" – or moderately skilled middle-class occupations such as cashiers, typists, welders, farmers, appliance repairmen – declined the furthest in wage rates and/or numbers. Employment requiring either more skill or less has been less affected.
However the timing of the great technological change of the era – internet use by business starting in the late 1990s – does not match that of the growth of income inequality (starting in the early 1970s but slackening somewhat in the 1990s).
Nor does the introduction of technologies that increase the demand for more skilled workers seem to be generally associated with a divergence in household income among the population. Inventions of the 20th century such as AC electric power, the automobile, airplane, radio, television, the washing machine, Xerox machine, each had an economic impact similar to computers, microprocessors and internet, but did not coincide with greater inequality.
Another explanation is that the combination of the introduction of technologies that increase the demand for skilled workers, and the failure of the American education system to provide a sufficient increase in those skilled workers has bid up those workers' salaries.
An example of the slowdown in education growth in America (that began about the same time as the Great Divergence began) is the fact that the average person born in 1945 received two more years of schooling than his parents, while the average person born in 1975 received only half a year more of schooling.
Author Timothy Noah's "back-of-the-envelope" estimation based on "composite of my discussions with and reading of the various economists and political scientists" is that the "various failures" in America's education system are "responsible for 30%" of the post-1978 increase in inequality.
Race and Gender Disparities:
Further information: Gender pay gap in the United States and Racial wage gap in the United States
Income levels vary by gender and race, with median incomes for females and for certain racial demographics falling considerably below the national median.
Despite considerable progress in pursuing gender and racial equality, some social scientists like Richard Schaeffer attribute these discrepancies in income partly to continued discrimination.
Among women, part of the wage gap is due to employment choices and preferences. Women are more likely to consider factors other than salary when looking for employment. On average, women are less willing to travel or relocate, take more hours off and work fewer hours, and choose college majors that lead to lower paying jobs. Women are also more likely to work for governments or non-profits which pay less than the private sector.
According to this perspective certain ethnic minorities and women receive fewer promotions and opportunities for occupation and economic advancement than others. In the case of women this concept is referred to as the glass ceiling keeping women from climbing the occupational ladder.
In terms of race, Asian Americans are far more likely to be in the highest earning 5 percent than the rest of Americans. Studies have shown that African Americans are less likely to be hired than White Americans with the same qualifications.
The continued prevalence of traditional gender roles and ethnic stereotypes may partially account for current levels of discrimination. In 2005, median income levels were highest among Asian and White males and lowest among females of all races, especially those identifying as African American or Hispanic.
Despite closing gender and racial gaps, considerable discrepancies remain among racial and gender demographics, even at the same level of educational attainment. The economic success of Asian Americans may come from how they devote much more time to education than their peers. Asian Americans have significantly higher college graduation rates than their peers and are much more likely to enter high status and high income occupations.
Since 1953 the income gap between male and female workers has decreased considerably but remains relatively large. Women currently earn significantly more Associate's, Bachelor's, and Master's degrees than men and almost as many Doctorates. Women are projected to have passed men in Doctorates earned in 2006–2007, and to earn nearly two thirds of Associate's, Bachelor's, and Master's degrees by 2016.
Income inequality between the sexes, however, remained stark at all levels of educational attainment. Between 1953 and 2005 median earnings as well as educational attainment increased far more rapidly for women than for men: median income for female earners rose 157.2%, versus 36.2% for male earners, over four times as fast.
Today the median male worker earns roughly 68.4% more than their female counterparts, compared to 176.3% in 1953. The median income of men in 2005 was 2% higher than in 1973 compared to a 74.6% increase for female earners.
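Figures of the form "men earn X% more" convert to the more familiar female-to-male earnings ratio as 1 / (1 + X/100). A minimal sketch using only the percentages in the text:

```python
# Converting "the median male worker earns X% more than the median female
# worker" into a female-to-male earnings ratio (percentages from the text).
def female_to_male_ratio(pct_more):
    """If men earn pct_more% more, women earn 1 / (1 + pct_more/100) of men's pay."""
    return 1.0 / (1.0 + pct_more / 100.0)

ratio_recent = female_to_male_ratio(68.4)   # ~0.594: ~59 cents per male dollar
ratio_1953 = female_to_male_ratio(176.3)    # ~0.362 in 1953

print(round(ratio_recent, 3), round(ratio_1953, 3))
```

Expressed as ratios, the gap narrowed from women earning roughly 36% of male median pay in 1953 to roughly 59% in the period the text describes.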
Racial differences remained stark as well, with the highest-earning race-gender demographic of workers aged 25 or older, Asian males (who were roughly tied with white males), earning slightly more than twice as much as the lowest-earning demographic, Hispanic females.
As mentioned above, inequality between races and genders persisted at similar education levels. Racial differences were overall more pronounced among male than among female income earners. In 2009, research indicates, Hispanics were more than twice as likely to be poor as non-Hispanic whites.
Lower average English ability, low levels of educational attainment, part-time employment, the youthfulness of Hispanic household heads, and the 2007–09 recession are important factors that have pushed up the Hispanic poverty rate relative to non-Hispanic whites.
During the early 1990s, median earnings decreased for both sexes, not increasing substantially until the late 1990s. Since 1974 the median income for workers of both sexes increased by 31.7% from $18,474 to $24,325, reaching its high point in 2000.
Incentives:
In the context of concern over income inequality, a number of economists, such as Federal Reserve Chairman Ben Bernanke, have talked about the importance of incentives: "... without the possibility of unequal outcomes tied to differences in effort and skill, the economic incentive for productive behavior would be eliminated, and our market-based economy ... would function far less effectively."
Since abundant supply decreases market value, the possession of scarce skills considerably increases income. Among the American lower class, the most common source of income was not occupation, but government welfare.
Stock buybacks:
Writing in the Harvard Business Review in September 2014, William Lazonick blamed record corporate stock buybacks for reduced investment in the economy and a corresponding impact on prosperity and income inequality.
Between 2003 and 2012, the 449 companies in the S&P 500 used 54% of their earnings ($2.4 trillion) to buy back their own stock. An additional 37% was paid to stockholders as dividends. Together, these were 91% of profits. This left little for investment in productive capabilities or higher income for employees, shifting more income to capital rather than labor.
Lazonick blamed executive compensation arrangements, which are heavily based on stock options, stock awards and bonuses for meeting earnings per share (EPS) targets (EPS increases as the number of outstanding shares decreases). Restrictions on buybacks were greatly eased in the early 1980s. He advocates changing these incentives to limit buybacks.
U.S. companies are projected to increase buybacks to $701 billion in 2015 according to Goldman Sachs, an 18% increase over 2014. For scale, annual non-residential fixed investment (a proxy for business investment and a major GDP component) was estimated to be about $2.1 trillion for 2014.
Journalist Timothy Noah wrote in 2012 that: "My own preferred hypothesis is that stockholders appropriated what once belonged to middle-class wage earners." Since the vast majority of stocks are owned by higher income households, this contributes to income inequality.
Journalist Harold Meyerson wrote in 2014 that: "The purpose of the modern U.S. corporation is to reward large investors and top executives with income that once was spent on expansion, research, training and employees."
Tax and transfer policies:
Main article: Tax policy and economic inequality in the United States
Background:
U.S. income inequality is comparable to that of other developed nations pre-tax, but is among the worst after taxes and transfers. This indicates that U.S. tax and transfer policies redistribute income from higher-income to lower-income households relatively less than those of other developed countries.
Journalist Timothy Noah summarized the results of several studies in his 2012 book The Great Divergence:
- Economists Piketty and Saez reported in 2007, that U.S. taxes on the rich had declined over the 1979–2004 period, contributing to increasing after-tax income inequality. While dramatic reductions in the top marginal income tax rate contributed somewhat to worsening inequality, other changes to the tax code (e.g., corporate, capital gains, estate, and gift taxes) had more significant impact. Considering all federal taxes, including the payroll tax, the effective tax rate on the top 0.01% fell dramatically, from 59.3% in 1979 to 34.7% in 2004. CBO reported an effective tax rate decline from 42.9% in 1979 to 32.3% in 2004 for the top 0.01%, using a different income measurement. In other words, the effective tax rate on the very highest income taxpayers fell by about one-quarter.
- CBO estimated that the combined effect of federal taxes and government transfers reduced income inequality (as measured by the Gini Index) by 23% in 1979. By 2007, the combined effect was to reduce income inequality by 17%. So the tax code remained progressive, only less so.
- While pre-tax income is the primary driver of income inequality, the less progressive tax code further increased the share of after-tax income going to the highest income groups. For example, had these tax changes not occurred, the after-tax income share of the top 0.1% would have been approximately 4.5% in 2000 instead of the 7.3% actual figure.
Income taxes:
A key factor in income inequality/equality is the effective rate at which income is taxed, coupled with the progressivity of the tax system. A progressive tax is a tax in which the effective tax rate increases as the taxable base amount increases. Overall income tax rates in the U.S. are below the OECD average, and until 2005 had been declining.
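The definition of a progressive tax can be made concrete with a minimal bracket calculation. The brackets and rates below are hypothetical, chosen only to show that the effective rate (tax paid divided by taxable income) rises with the taxable base:

```python
# A minimal sketch of a progressive tax schedule. Brackets and rates
# are hypothetical; the point is that the *effective* rate rises
# with income even though each bracket's marginal rate is flat.

BRACKETS = [               # (upper bound of bracket, marginal rate)
    (10_000, 0.10),
    (50_000, 0.20),
    (float("inf"), 0.30),
]

def tax_owed(income: float) -> float:
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        owed += (min(income, upper) - lower) * rate  # tax within this bracket
        lower = upper
    return owed

def effective_rate(income: float) -> float:
    return tax_owed(income) / income

print(round(effective_rate(20_000), 3))   # 0.15
print(round(effective_rate(100_000), 3))  # 0.24
```

A flat tax, by contrast, would produce the same effective rate at every income level.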
How much tax policy change over the last thirty years has contributed to income inequality is disputed. In their comprehensive 2011 study of income inequality (Trends in the Distribution of Household Income Between 1979 and 2007), the CBO found that the top fifth of the population saw a 10-percentage-point increase in their share of after-tax income.
Most of that growth went to the top 1 percent of the population. All other groups saw their shares decline by 2 to 3 percentage points. In 2007, federal taxes and transfers reduced the dispersion of income by 20 percent, but that equalizing effect was larger in 1979. The share of transfer payments to the lowest-income households declined. The overall average federal tax rate fell.
However, a more recent CBO analysis indicates that, with changes to tax law in 2013 (e.g., the expiration of the 2001–2003 Bush tax cuts for top earners and the increased payroll taxes passed as part of the Affordable Care Act), the effective federal tax rates for the highest-earning households will increase to levels not seen since 1979.
According to journalist Timothy Noah, "you can't really demonstrate that U.S. tax policy had a large impact on the three-decade income inequality trend one way or the other. The inequality trend for pre-tax income during this period was much more dramatic." Noah estimates tax changes account for 5% of the Great Divergence.
But many – such as economist Paul Krugman – emphasize the effect of changes in taxation – such as the 2001 and 2003 Bush administration tax cuts which cut taxes far more for high-income households than those below – on increased income inequality.
Part of the growth of income inequality under Republican administrations (described by Larry Bartels) has been attributed to tax policy. A study by Thomas Piketty and Emmanuel Saez found that "large reductions in tax progressivity since the 1960s took place primarily during two periods: the Reagan presidency in the 1980s and the Bush administration in the early 2000s."
During Republican President Ronald Reagan's tenure in office, the top marginal income tax rate was reduced from over 70 percent to 28 percent; top marginal rates of that height had been in place during much of the period of great income equality that followed the "Great Compression". The lowest marginal rate, for the bottom bracket, fell from 14 to 11 percent. However, the effective rate on top earners before Reagan's tax cuts was much lower because of loopholes and charitable contributions.
Taxes on capital:
Taxes on income derived from capital (e.g., financial assets, property and businesses) primarily affect higher income groups, who own the vast majority of capital. For example, in 2010 approximately 81% of stocks were owned by the top 10% income group and 69% by the top 5%.
Only about one-third of American households have stock holdings worth more than $7,000. Therefore, since higher-income taxpayers have a much higher share of their income represented by capital gains, lowering taxes on capital income and gains increases after-tax income inequality.
Capital gains taxes were reduced around the time income inequality began to rise again around 1980 and several times thereafter. During 1978 under President Carter, the top capital gains tax rate was reduced from 49% to 28%. President Ronald Reagan's 1981 cut in the top rate on unearned income reduced the maximum capital gains rate to only 20% – its lowest level since the Hoover administration, as part of an overall economic growth strategy.
The capital gains tax rate was also reduced by President Bill Clinton in 1997, from 28% to 20%. President George W. Bush reduced the tax rate on capital gains and qualifying dividends from 20% to 15%, less than half the 35% top rate on ordinary income.
CBO reported in August 1990 that: "Of the 8 studies reviewed, five, including the two CBO studies, found that cutting taxes on capital gains is not likely to increase savings, investment, or GNP much if at all." Some of the studies indicated the loss in revenue from lowering the tax rate may be offset by higher economic growth, others did not.
Journalist Timothy Noah wrote in 2012 that: "Every one of these changes elevated the financial interests of business owners and stockholders above the well-being, financial or otherwise, of ordinary citizens." So overall, while cutting capital gains taxes worsens income inequality, its economic benefits are debatable.
Other tax policies:
Rising inequality has also been attributed to President Bush's veto of tax harmonization, which would have prohibited offshore tax havens.
Debate over effects of tax policies:
One study found that reductions in total effective tax rates were most significant for individuals with the highest incomes (see "Federal Tax Rate by Income Group" chart). For those with incomes in the top 0.01 percent, the overall federal tax rate fell from 74.6% in 1970 to 34.7% in 2004 (the trend reversed briefly in 2000, rising to 40.8% after the 1993 Clinton deficit-reduction tax bill); for the next 0.09 percent it fell from 59.1% to 34.1%, before leveling off with a relatively modest drop from 41.4% to 33.0% for the 99.5–99.9 percent group.
Although the tax rate for low-income earners fell as well (though not as much), these tax reductions compare with virtually no change – 23.3% tax rate in 1970, 23.4% in 2004 – for the US population overall.
The study found that the decline in progressivity since 1960 was due to a shift from the allocation of corporate income taxes among labor and capital to the effects of the individual income tax. Paul Krugman also supports this claim, saying: "The overall tax rate on these high income families fell from 36.5% in 1980 to 26.7% in 1989."
From the White House's own analysis, the federal tax burden for those making greater than $250,000 fell considerably during the late 1980s, 1990s and 2000s, from an effective tax rate of 35% in 1980 down to under 30% from the late 1980s to 2011.
Many studies argue that tax changes affecting S corporations confound the statistics prior to 1990. However, even after these changes, inflation-adjusted average after-tax income grew by 25% between 1996 and 2006 (the last year for which individual income tax data were publicly available).
This average increase, however, obscures a great deal of variation. The poorest 20% of tax filers experienced a 6% reduction in income while the top 0.1 percent of tax filers saw their income almost double. Tax filers in the middle of the income distribution experienced about a 10% increase in income. Also during this period, the proportion of income from capital increased for the top 0.1 percent from 64% to 70%.
Transfer payments:
Transfer payments refer to payments to persons such as social security, unemployment compensation, or welfare. CBO reported in November 2014 that: "Government transfers reduce income inequality because the transfers received by lower-income households are larger relative to their market income than are the transfers received by higher-income households.
Federal taxes also reduce income inequality, because the taxes paid by higher-income households are larger relative to their before-tax income than are the taxes paid by lower-income households. The equalizing effects of government transfers were significantly larger than the equalizing effects of federal taxes from 1979 to 2011."
CBO also reported that less progressive tax and transfer policies have contributed to greater after-tax income inequality: "As a result of the diminishing effect of transfers and federal taxes, the Gini index for income after transfers and federal taxes grew by more than the index for market income.
Between 1979 and 2007, the Gini index for market income increased by 23 percent, the index for market income after transfers increased by 29 percent, and the index for income measured after transfers and federal taxes increased by 33 percent."
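The Gini index figures quoted above can be grounded with a minimal computation. This sketch uses the standard mean-absolute-difference formula for a finite income list (the CBO's exact methodology differs in detail, and the incomes below are hypothetical); 0 means perfect equality, values near 1 mean extreme concentration:

```python
# A minimal Gini coefficient for a list of incomes (hypothetical data).
# Uses the standard rank-weighted formula, equivalent to the mean
# absolute difference between all pairs divided by twice the mean.

def gini(incomes):
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))  # rank-weighted sum
    return (2 * weighted) / (n * total) - (n + 1) / n

print(gini([10, 10, 10, 10]))         # 0.0 - everyone earns the same
print(round(gini([1, 2, 3, 94]), 3))  # 0.7 - income concentrated at the top
```

A rising Gini index between two years, as in the CBO figures, means the second year's income distribution is further from the equal-shares benchmark.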
Tax expenditures:
Tax expenditures (i.e., exclusions, deductions, preferential tax rates, and tax credits) cause revenues to be much lower than they would otherwise be for any given tax rate structure.
The benefits from tax expenditures, such as income exclusions for healthcare insurance premiums paid for by employers and tax deductions for mortgage interest, are distributed unevenly across the income spectrum. They are often what the Congress offers to special interests in exchange for their support. According to a report from the CBO that analyzed the 2013 data:
- The top 10 tax expenditures totalled $900 billion. This is a proxy for how much they reduced revenues or increased the annual budget deficit.
- Tax expenditures tend to benefit those at the top and bottom of the income distribution, but less so in the middle.
- The top 20% of income earners received approximately 50% of the benefit from them; the top 1% received 17% of the benefits.
- The largest single tax expenditure was the exclusion from income of employer sponsored health insurance ($250 billion).
- Preferential tax rates on capital gains and dividends were $160 billion; the top 1% received 68% of the benefit or $109 billion from lower income tax rates on these types of income.
Understanding how each tax expenditure is distributed across the income spectrum can inform policy choices.
Other causes:
Shifts in political power:
Paul Krugman wrote in 2015 that: "Economists struggling to make sense of economic polarization are, increasingly, talking not about technology but about power."
This market power hypothesis basically asserts that market power has concentrated in monopolies and oligopolies that enable unusual amounts of income ("rents") to be transferred from the many consumers to relatively few owners. This hypothesis is consistent with higher corporate profits without a commensurate rise in investment, as firms facing less competition choose to pass a greater share of their profits to shareholders (such as through share buybacks and dividends) rather than re-invest in the business to ward off competitors.
One cause of this concentration of market power was the rightward shift in American politics toward more conservative policies since 1980, as politics plays a big role in how market power can be exercised.
Policies that removed barriers to monopoly and oligopoly included anti-union laws, reduced anti-trust activity, deregulation (or failure to regulate) non-depository banking, contract laws that favored creditors over debtors, etc. Further, rising wealth concentration can be used to purchase political influence, creating a feedback loop.
Decline of unions:
Further information: Labor unions in the United States
The era of inequality growth has coincided with a dramatic decline in labor union membership from 20% of the labor force in 1983 to about 12% in 2007. Classical and neoclassical economists have traditionally thought that since the chief purpose of a union is to maximize the income of its members, a strong but not all-encompassing union movement would lead to increased income inequality.
However, given the increase in income inequality of the past few decades, either the sign of the effect must be reversed, or the magnitude of the effect must be small and a much larger opposing force has overridden it.
However, more recent research has shown that unions' ability to reduce income disparities among members outweighed other factors, so that their net effect has been to reduce national income inequality.
The decline of unions has hurt this leveling effect among men, with one economist (Berkeley's David Card) estimating that about 15–20% of the "Great Divergence" among men is the result of declining unionization.
According to scholars, "As organized labor's political power dissipates, economic interests in the labor market are dispersed and policy makers have fewer incentives to strengthen unions or otherwise equalize economic rewards."
Unions were a balancing force, helping ensure wages kept up with productivity and that neither executives nor shareholders were unduly rewarded. Further, societal norms placed constraints on executive pay.
This changed as union power declined (the share of unionized workers fell significantly during the Great Divergence, from over 30% to around 12%) and CEO pay skyrocketed (rising from around 40 times the average workers pay in the 1970s to over 350 times in the early 2000s).
A 2015 report by the International Monetary Fund also attributes the decline of labor's share of GDP to de-unionization, noting the trend "necessarily increases the income share of corporate managers' pay and shareholder returns ... Moreover, weaker unions can reduce workers' influence on corporate decisions that benefit top earners, such as the size and structure of top executive compensation."
Still other researchers think the labor movement's loss of national political power to promote equalizing "government intervention and changes in private sector behavior" has had the greatest impact on inequality in the US. Sociologist Jake Rosenfeld of the University of Washington argues that labor unions were the primary institution fighting inequality in the United States and helped grow a multiethnic middle class, and that their decline has resulted in diminishing prospects for U.S. workers and their families.
Timothy Noah estimates the "decline" of labor union power to be "responsible for 20%" of the Great Divergence. While the decline of union power in the US has been a factor in declining middle-class incomes, unions have retained their clout in Western Europe.
In Denmark, influential trade unions such as Fagligt Fælles Forbund (3F) ensure that fast-food workers earn a living wage, the equivalent of $20 an hour, which is more than double the hourly rate for their counterparts in the United States.
Critics of technological change as an explanation for the "Great Divergence" of income levels in America point to public policy and party politics, or "stuff the government did, or didn't do". They argue these have led to a trend of declining labor union membership rates and resulting diminishing political clout, decreased expenditure on social services, and less government redistribution. Moreover, the United States is the only advanced economy without a labor-based political party.
As of 2011, several state legislatures have launched initiatives aimed at lowering wages, labor standards, and workplace protections for both union and non-union workers.
The economist Joseph Stiglitz argues that "Strong unions have helped to reduce inequality, whereas weaker unions have made it easier for CEOs, sometimes working with market forces that they have helped shape, to increase it." The long fall in unionization in the U.S. since WWII has seen a corresponding rise in the inequality of wealth and income.
Political parties and presidents:
Liberal political scientist Larry Bartels has found a strong correlation between the party of the president and income inequality in America since 1948 (see below). Examining average annual pre-tax income growth from 1948 to 2005 (a span encompassing most of the egalitarian Great Compression and the entire inegalitarian Great Divergence), Bartels shows that under Democratic presidents (from Harry Truman forward), the greatest income gains went to the bottom of the income scale and tapered off as income rose.
Under Republican presidents, in contrast, gains were much smaller, and what growth there was was concentrated towards the top, tapering off as you went down the income scale.
Summarizing Bartels's findings, journalist Timothy Noah referred to the administrations of Democratic presidents as "Democrat-world", and GOP administrations as "Republican-world":
In Democrat-world, pre-tax income increased 2.64% annually for the poor and lower-middle-class and 2.12% annually for the upper-middle-class and rich. There was no Great Divergence. Instead, the Great Compression – the egalitarian income trend that prevailed through the 1940s, 1950s, and 1960s – continued to the present, albeit with incomes converging less rapidly than before.
In Republican-world, meanwhile, pre-tax income increased 0.43 percent annually for the poor and lower-middle-class and 1.90 percent for the upper-middle-class and rich. Not only did the Great Divergence occur; it was more greatly divergent. Also of note: In Democrat-world pre-tax income increased faster than in the real world not just for the 20th percentile but also for the 40th, 60th, and 80th.
We were all richer and more equal! But in Republican-world, pre-tax income increased slower than in the real world not just for the 20th percentile but also for the 40th, 60th, and 80th. We were all poorer and less equal! Democrats also produced marginally faster income growth than Republicans at the 95th percentile, but the difference wasn't statistically significant.
The pattern of distribution of growth appears to be the result of a whole host of policies,
including not only the distribution of taxes and benefits but also the government's stance toward unions, whether the minimum wage rises, the extent to which the government frets about inflation versus too-high interest rates, etc., etc.
Noah admits the evidence of this correlation is "circumstantial rather than direct", but so is "the evidence that smoking is a leading cause of lung cancer."
In his 2017 book The Great Leveler, historian Walter Scheidel points out that, starting in the 1970s, both parties shifted towards promoting free market capitalism, with Republicans moving further to the political right than Democrats moved to the political left. He notes that Democrats were instrumental in the financial deregulation of the 1990s and largely neglected social welfare issues while increasingly focusing on issues pertaining to identity politics.
The Clinton Administration in particular continued promoting free market, or neoliberal, reforms which began under the Reagan Administration.
Non-party political action:
Further information: Executive pay in the United States
According to political scientists Jacob Hacker and Paul Pierson writing in the book Winner-Take-All Politics, the important policy shifts were brought on not by the Republican Party but by the development of a modern, efficient political system, especially lobbying, by top earners – and particularly corporate executives and the financial services industry.
The end of the 1970s saw a transformation of American politics away from a focus on the middle class, with new, much more effective, aggressive and well-financed lobbyists and pressure groups acting on behalf of upper income groups. Executives successfully eliminated any countervailing power or oversight of corporate managers (from private litigation, boards of directors and shareholders, the Securities and Exchange Commission or labor unions).
The financial industry's success came from pushing for deregulation of financial markets, allowing much more lucrative but much more risky investments, from which it privatized the gains while socializing the losses with government bailouts. (The two groups formed about 60% of the top 0.1 percent of earners.) All top earners were helped by deep cuts in estate and capital gains taxes, and in tax rates on high levels of income.
Arguing against the proposition that the explosion in pay for corporate executives – which grew from 35 times average worker pay in 1978 to over 250 times average pay before the 2007 recession – is driven by an increased demand for scarce talent and set according to performance, Krugman points out that multiple factors outside of executives' control govern corporate profitability, particularly in the short term, when the head of a company like Enron may look like a great success.
Further, corporate boards follow other companies in setting pay even if the directors themselves disagree with lavish pay "partly to attract executives whom they consider adequate, partly because the financial market will be suspicious of a company whose CEO isn't lavishly paid." Finally "corporate boards, largely selected by the CEO, hire compensation experts, almost always chosen by the CEO" who naturally want to please their employers.
Lucian Arye Bebchuk and Jesse M. Fried, the authors of Pay Without Performance, a critique of executive pay, argue that executive capture of corporate governance is so complete that only public relations, i.e. public "outrage", constrains their pay. This constraint has in turn weakened as traditional critics of excessive pay – politicians (whose need for campaign contributions from the richest outweighs populist indignation), the media (lauding business genius), and unions (crushed) – have fallen silent.
In addition to politics, Krugman postulated that changes in the norms of corporate culture have played a factor. Although in the 1950s and 60s corporate executives had (or could develop) the ability to pay themselves very high compensation through control of corporate boards of directors, they restrained themselves. But from 1982 to 2002, the average real annual compensation of the top 100 C.E.O.s skyrocketed from $1.3 million (39 times the pay of an average worker) to $37.5 million (more than 1,000 times the pay of ordinary workers).
Journalist George Packer also sees the dramatic increase in inequality in America as a product of the change in attitude of the American elite, which (in his view) has been transitioning itself from pillars of society to a special interest group.
Author Timothy Noah estimates that what he calls "Wall Street and corporate boards' pampering" of the highest earning 0.1% is "responsible for 30%" of the post-1978 increase in inequality.
Immigration:
The Immigration and Nationality Act of 1965 increased immigration to America, especially of non-Europeans. From 1970 to 2007, the foreign-born proportion of America's population grew from 5% to 11%, most of whom had lower education levels and incomes than native-born Americans.
But the contribution of this increase in the supply of low-skill labor seems to have been relatively modest. One estimate stated that immigration reduced the average annual income of native-born "high-school dropouts" ("who roughly correspond to the poorest tenth of the workforce") by 7.4% from 1980 to 2000.
The decline in income of better educated workers was much less. Author Timothy Noah estimates that "immigration" is responsible for just 5% of the "Great Divergence" in income distribution, as does economist David Card.
While immigration was found to have slightly depressed the wages of the least skilled and least educated American workers, it doesn't explain rising inequality among high school and college graduates.
Scholars such as political scientists Jacob S. Hacker, Paul Pierson, Larry Bartels and Nathan Kelly, and economist Timothy Smeeding question the explanation based on educational attainment and workplace skills, pointing out that other countries with similar education levels and economies have not gone the way of the US, and that the concentration of income in the US hasn't followed a pattern of "the 29% of Americans with college degrees pulling away" from those who have less education.
Wage theft:
A September 2014 report by the Economic Policy Institute claims wage theft is also responsible for exacerbating income inequality: "Survey evidence suggests that wage theft is widespread and costs workers billions of dollars a year, a transfer from low-income employees to business owners that worsens income inequality, hurts workers and their families, and damages the sense of fairness and justice that a democracy needs to survive."
Corporatism:
See also: Corporatocracy
Edmund Phelps published an analysis in 2010 theorizing that the cause of income inequality is not free market capitalism, but instead is the result of the rise of corporatism.
Corporatism, in his view, is the antithesis of free market capitalism. It is characterized by semi-monopolistic organizations and banks, big employer confederations, often acting with complicit state institutions in ways that discourage (or block) the natural workings of a free economy.
The primary effects of corporatism are the consolidation of economic power and wealth with end results being the attrition of entrepreneurial and free market dynamism.
His follow-up book, Mass Flourishing, further defines corporatism by the following attributes:
- power-sharing between government and large corporations (exemplified in the U.S. by widening government power in areas such as financial services, healthcare, and energy through regulation),
- an expansion of corporate lobbying and campaign support in exchange for government reciprocity,
- an escalation in the growth and influence of the financial and banking sectors,
- increased consolidation of the corporate landscape through merger and acquisition (with ensuing increases in corporate executive compensation),
- increased potential for corporate/government corruption and malfeasance,
- and a lack of entrepreneurial and small business development leading to lethargic and stagnant economic conditions.
Today, in the United States, virtually all of these economic conditions are being borne out. With regard to income inequality, the 2014 income analysis of University of California, Berkeley economist Emmanuel Saez confirms that relative growth of income and wealth is not occurring among small and mid-sized entrepreneurs and business owners (who generally populate the lower half of the top one percent in income), but instead only among the top 0.1 percent of the income distribution, whom Paul Krugman describes as "super-elites – corporate bigwigs and financial wheeler-dealers" who earn $2,000,000 or more every year.
For example, measured relative to GDP, total compensation and its component wages and salaries have been declining since 1970. This indicates a shift in income from labor (persons who derive income from hourly wages and salaries) to capital (persons who derive income via ownership of businesses, land and assets).
Wages and salaries have fallen from approximately 51% GDP in 1970 to 43% GDP in 2013. Total compensation has fallen from approximately 58% GDP in 1970 to 53% GDP in 2013.
To put this in perspective, five percent of U.S. GDP was approximately $850 billion in 2013. This represents an additional $7,000 in wages and salaries for each of the 120 million U.S. households. Larry Summers estimated in 2007 that the lower 80% of families were receiving $664 billion less income than they would be with a 1979 income distribution (a period of much greater equality), or approximately $7,000 per family.
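The back-of-the-envelope arithmetic above is easy to check. This sketch uses only the figures quoted in the text ($850 billion as 5% of GDP, 120 million households):

```python
# Checking the scale arithmetic from the text: 5% of 2013 U.S. GDP
# was roughly $850 billion, spread over roughly 120 million households.

five_pct_of_gdp = 850e9            # $850 billion, per the text
implied_gdp = five_pct_of_gdp / 0.05   # implies GDP of about $17 trillion
households = 120e6
per_household = five_pct_of_gdp / households

print(f"${implied_gdp / 1e12:.1f} trillion implied GDP")   # $17.0 trillion
print(f"${per_household:,.0f} per household")              # $7,083 per household
```

The implied GDP of about $17 trillion is consistent with actual 2013 U.S. GDP, and the roughly $7,000-per-household result matches both the text's figure and the Summers estimate quoted alongside it.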
Not receiving this income may have led many families to increase their debt burden, a significant factor in the 2007-2009 subprime mortgage crisis, as highly leveraged homeowners suffered a much larger reduction in their net worth during the crisis. Further, since lower income families tend to spend relatively more of their income than higher income families, shifting more of the income to wealthier families may slow economic growth.
In another example, The Economist propounds that a swelling corporate financial and banking sector has caused Gini Coefficients to rise in the U.S. since 1980: "Financial services' share of GDP in America doubled to 8% between 1980 and 2000; over the same period their profits rose from about 10% to 35% of total corporate profits, before collapsing in 2007–09. Bankers are being paid more, too. In America the compensation of workers in financial services was similar to average compensation until 1980. Now it is twice that average."
The summary argument, considering these findings, is that if corporatism is the consolidation and sharing of economic and political power between large corporations and the state ... then a corresponding concentration of income and wealth (with resulting income inequality) is an expected by-product of such a consolidation.
Neoliberalism:
See also: Market fundamentalism
Some economists, sociologists and anthropologists argue that neoliberalism, or the resurgence of 19th century theories relating to laissez-faire economic liberalism in the late 1970s, has been the significant driver of inequality.
More broadly, according to The Handbook of Neoliberalism, the term has "become a means of identifying a seemingly ubiquitous set of market-oriented policies as being largely responsible for a wide range of social, political, ecological and economic problems."
Vicenç Navarro points to policies pertaining to the deregulation of labor markets, privatization of public institutions, union busting and reduction of public social expenditures as contributors to this widening disparity.
The privatization of public functions, for example, grows income inequality by depressing wages and eliminating benefits for middle class workers while increasing income for those at the top. The deregulation of the labor market undermined unions by allowing the real value of the minimum wage to plummet, resulting in employment insecurity and widening wage and income inequality.
David M. Kotz, professor of economics at the University of Massachusetts Amherst, contends that neoliberalism "is based on the thorough domination of labor by capital." As such, the advent of the neoliberal era has seen a sharp increase in income inequality through the decline of unionization, stagnant wages for workers and the rise of CEO supersalaries.
According to Emmanuel Saez: "The labor market has been creating much more inequality over the last thirty years, with the very top earners capturing a large fraction of macroeconomic productivity gains. A number of factors may help explain this increase in inequality, not only underlying technological changes but also the retreat of institutions developed during the New Deal and World War II, such as progressive tax policies, powerful unions, corporate provision of health and retirement benefits, and changing social norms regarding pay inequality."
Pennsylvania State University political science professor Pamela Blackmon attributes the trends of growing poverty and income inequality to the convergence of several neoliberal policies during Ronald Reagan's presidency, including the decreased funding of education, decreases in the top marginal tax rates, and shifts in transfer programs for those in poverty.
Journalist Mark Bittman echoes this sentiment in a 2014 piece for The New York Times:
"The progress of the last 40 years has been mostly cultural, culminating, the last couple of years, in the broad legalization of same-sex marriage. But by many other measures, especially economic, things have gotten worse, thanks to the establishment of neo-liberal principles — anti-unionism, deregulation, market fundamentalism and intensified, unconscionable greed — that began with Richard Nixon and picked up steam under Ronald Reagan. Too many are suffering now because too few were fighting then."
Fred L. Block and Margaret Somers, in expanding on Karl Polanyi's critique of laissez-faire theories in The Great Transformation, argue that Polanyi's analysis helps to explain why the revival of such ideas has contributed to the "persistent unemployment, widening inequality, and the severe financial crises that have stressed Western economies over the past forty years."
John Schmitt and Ben Zipperer of the Center for Economic and Policy Research also point to economic liberalism as one of the causes of income inequality. They note that European nations, in particular the social democracies of Northern Europe with extensive and well funded welfare states, have lower levels of income inequality and social exclusion than the United States.
See also:
- Marginal revenue productivity theory of wages
- Wealth inequality in the United States
- Corporatocracy#United States
- CBO: "Trends in the Distribution of Household Income Between 1979 and 2007" (October 25, 2011)
- CBO: "Trends in the Distribution of Household Income and Federal Taxes, 2011" (November 12, 2014)
- Emmanuel Saez: "Striking it Richer" (September 3, 2013)
Capitalism or Socialism?
Opinion of socialism is not overwhelmingly negative (Monmouth University poll, 5/6/2019)
- YouTube Video from the Movie "Wall Street": Gordon Gekko "Greed is Good"
- YouTube Video: Why Capitalism is Better than Socialism
- YouTube Video: Does Capitalism Exploit Workers?
[Your Web Host: during the 2020 elections, President Trump and his Republican allies attempted to paint a negative picture of Democrats as seeking Socialism, e.g., socialized medicine. For the purposes of this web page, below you will find the Wikipedia articles on each for your own understanding. My Take: Capitalism is much better to the average citizen than socialism, although capital markets must be regulated to avoid the worst human trait capitalism can cause: greed!]
Capitalism is an economic system based on the private ownership of the means of production and their operation for profit.
Characteristics central to capitalism include private property, capital accumulation, wage labor, voluntary exchange, a price system and competitive markets.
In a capitalist market economy, decision-making and investments are determined by every owner of wealth, property or production ability in financial and capital markets, whereas prices and the distribution of goods and services are mainly determined by competition in goods and services markets.
Economists, political economists, sociologists and historians have adopted different perspectives in their analyses of capitalism and have recognized various forms of it in practice. These include laissez-faire or free-market capitalism, welfare capitalism and state capitalism.
Different forms of capitalism feature varying degrees of free markets, public ownership, obstacles to free competition and state-sanctioned social policies. The degree of competition in markets, the role of intervention and regulation, and the scope of state ownership vary across different models of capitalism.
The extent to which different markets are free as well as the rules defining private property are matters of politics and policy. Most existing capitalist economies are mixed economies which combine elements of free markets with state intervention and in some cases economic planning.
Market economies have existed under many forms of government and in many different times, places and cultures. Modern capitalist societies—marked by a universalization of money-based social relations, a consistently large and system-wide class of workers who must work for wages, and a capitalist class which owns the means of production—developed in Western Europe in a process that led to the Industrial Revolution.
Capitalist systems with varying degrees of direct government intervention have since become dominant in the Western world and continue to spread. Over time, capitalist countries have experienced consistent economic growth and an increase in the standard of living.
Critics of capitalism argue that it establishes power in the hands of a minority capitalist class that exists through the exploitation of the majority working class and their labor; prioritizes profit over social good, natural resources and the environment; and is an engine of inequality, corruption and economic instabilities.
Supporters argue that it provides better products and innovation through competition, disperses wealth to all productive people, promotes pluralism and decentralization of power, creates strong economic growth and yields productivity and prosperity that greatly benefit society.
Background:
The term "capitalist", meaning an owner of capital, appears earlier than the term "capitalism" and it dates back to the mid-17th century. "Capitalism" is derived from capital, which evolved from capitale, a late Latin word based on caput, meaning "head"—also the origin of "chattel" and "cattle" in the sense of movable property (only much later to refer only to livestock).
Capitale emerged in the 12th to 13th centuries in the sense of referring to funds, stock of merchandise, sum of money or money carrying interest. By 1283, it was used in the sense of the capital assets of a trading firm and it was frequently interchanged with a number of other words—wealth, money, funds, goods, assets, property and so on.
The Hollandische Mercurius uses "capitalists" in 1633 and 1654 to refer to owners of capital. In French, Étienne Clavier referred to capitalistes in 1788, six years before its first recorded English usage by Arthur Young in his work Travels in France (1792).
In his Principles of Political Economy and Taxation (1817), David Ricardo referred to "the capitalist" many times. Samuel Taylor Coleridge, an English poet, used "capitalist" in his work Table Talk (1823). Pierre-Joseph Proudhon used the term "capitalist" in his first work, What is Property? (1840), to refer to the owners of capital. Benjamin Disraeli used the term "capitalist" in his 1845 work Sybil.
The initial usage of the term "capitalism" in its modern sense has been attributed to Louis Blanc in 1850 ("What I call 'capitalism' that is to say the appropriation of capital by some to the exclusion of others") and Pierre-Joseph Proudhon in 1861 ("Economic and social regime in which capital, the source of income, does not generally belong to those who make it work through their labor").
Karl Marx and Friedrich Engels referred to the "capitalistic system" and to the "capitalist mode of production" in Capital (1867). The use of the word "capitalism" in reference to an economic system appears twice in Volume I of Capital, p. 124 (German edition) and in Theories of Surplus Value, tome II, p. 493 (German edition).
Marx did not extensively use the form capitalism, but instead those of capitalist and capitalist mode of production, which appear more than 2,600 times in the three volumes of Capital. According to the Oxford English Dictionary (OED), the term "capitalism" first appeared in English in 1854 in the novel The Newcomes by novelist William Makepeace Thackeray, where he meant "having ownership of capital". Also according to the OED, Carl Adolph Douai, a German American socialist and abolitionist, used the phrase "private capitalism" in 1863.
Click on any of the following blue hyperlinks for more about Capitalism:
- History
- Varieties of capitalism
- Characteristics
- Supply and demand
- Role of government
- Types of capitalism
- Capital accumulation
- Wage labor
- Effects of war
- Criticism
- See also:
- Capitalism (disambiguation)
- Communism
- Animal industrial complex
- Anti-capitalism
- Christian views on poverty and wealth
- Corporatocracy
- Crony capitalism
- Economic sociology
- Free Market
- Humanistic economics
- Invisible hand
- Late capitalism
- Le Livre noir du capitalisme – 1998 French book (The Black Book of Capitalism)
- Market socialism
- Perspectives on capitalism by school of thought
- Post-capitalism
- Post-Fordism
- Rent-seeking
- State monopoly capitalism
- Sustainable capitalism
- Global financial crisis in September 2008
- Capitalism on In Our Time at the BBC
- Selected Titles on Capitalism and Its Discontents. Harvard University Press.
Socialism is a range of economic and social systems characterized by social ownership of the means of production and workers' self-management, as well as the political theories and movements associated with them.
Social ownership can be public, collective or cooperative ownership, or citizen ownership of equity. There are many varieties of socialism and there is no single definition encapsulating all of them, with social ownership being the common element shared by its various forms.
Socialist systems are divided into non-market and market forms.
Non-market socialism involves replacing factor markets and money with engineering and technical criteria based on calculation performed in-kind, thereby producing an economic mechanism that functions according to different economic laws from those of capitalism (above).
Non-market socialism aims to circumvent the inefficiencies and crises traditionally associated with capital accumulation and the profit system. By contrast, market socialism retains the use of monetary prices, factor markets and in some cases the profit motive, with respect to the operation of socially owned enterprises and the allocation of capital goods between them.
Profits generated by these firms would be controlled directly by the workforce of each firm, or accrue to society at large in the form of a social dividend. The socialist calculation debate concerns the feasibility and methods of resource allocation for a socialist system.
Socialist politics has been both internationalist and nationalist in orientation; organised through political parties and opposed to party politics; at times overlapping with trade unions, and at other times independent and critical of unions; and present in both industrialized and developing nations.
Originating within the socialist movement, social democracy has embraced a mixed economy with a market that includes substantial state intervention in the form of income redistribution, regulation, and a welfare state. Economic democracy proposes a sort of market socialism where there is more decentralised control of companies, currencies, investments, and natural resources.
The socialist political movement includes a set of political philosophies that originated in the revolutionary movements of the mid-to-late 18th century and out of concern for the social problems that were associated with capitalism. By the late 19th century, after the work of Karl Marx and his collaborator Friedrich Engels, socialism had come to signify opposition to capitalism and advocacy for a post-capitalist system based on some form of social ownership of the means of production.
By the 1920s, social democracy and communism had become the two dominant political tendencies within the international socialist movement. By this time, socialism emerged as "the most influential secular movement of the twentieth century, worldwide.
It is a political ideology (or world view), a wide and divided political movement" and while the emergence of the Soviet Union as the world's first nominally socialist state led to socialism's widespread association with the Soviet economic model, some economists and intellectuals argued that in practice the model functioned as a form of state capitalism or a non-planned administrative or command economy.
Socialist parties and ideas remain a political force with varying degrees of power and influence on all continents, heading national governments in many countries around the world. Today, some socialists have also adopted the causes of other social movements, such as environmentalism, feminism and progressivism.
For Andrew Vincent, "[t]he word ‘socialism’ finds its root in the Latin sociare, which means to combine or to share. The related, more technical term in Roman and then medieval law was societas. This latter word could mean companionship and fellowship as well as the more legalistic idea of a consensual contract between freemen".
The term "socialism" was created by Henri de Saint-Simon, one of the founders of what would later be labelled "utopian socialism". Saint-Simon coined the term as a contrast to the liberal doctrine of "individualism", which stressed that people act or should act as if they are in isolation from one another.
The original "utopian" socialists condemned liberal individualism for failing to address social concerns during the industrial revolution, including poverty, social oppression and gross inequalities in wealth, thus viewing liberal individualism as degenerating society into supporting selfish egoism that harmed community life through promoting a society based on competition.
They presented socialism as an alternative to liberal individualism based on the shared ownership of resources, although their proposals for socialism differed significantly. Saint-Simon proposed economic planning, scientific administration and the application of modern scientific advancements to the organisation of society.
By contrast, Robert Owen proposed the organization of production and ownership in cooperatives.
The term "socialism" is also attributed to Pierre Leroux and to Marie Roch Louis Reybaud in France; and to Robert Owen in Britain who became one of the fathers of the cooperative movement.
The modern definition and usage of "socialism" settled by the 1860s, becoming the predominant term among the group of words "co-operative", "mutualist" and "associationist", which had previously been used as synonyms. The term "communism" also fell out of use during this period, despite earlier distinctions between socialism and communism from the 1840s.
An early distinction between socialism and communism was that the former aimed to socialise only production while the latter aimed to socialise both production and consumption (in the form of free access to final goods). However, by 1888 Marxists employed the term "socialism" in place of "communism", which had come to be considered an old-fashioned synonym for socialism.
It was not until 1917 after the Bolshevik Revolution that "socialism" came to refer to a distinct stage between capitalism and communism, introduced by Vladimir Lenin as a means to defend the Bolshevik seizure of power against traditional Marxist criticisms that Russia's productive forces were not sufficiently developed for socialist revolution.
A distinction between "communist" and "socialist" as descriptors of political ideologies arose in 1918 after the Russian Social-Democratic Labour Party renamed itself to the All-Russian Communist Party, where communist came to specifically mean socialists who supported the politics and theories of Leninism, Bolshevism and later Marxism–Leninism, although communist parties continued to describe themselves as socialists dedicated to socialism.
The words "socialism" and "communism" eventually came to align with the adherents' and opponents' cultural attitudes toward religion. In Christian Europe, communism was believed to be the atheist way of life. In Protestant England, the word "communism" was too culturally and aurally close to the Roman Catholic communion rite, hence English atheists denoted themselves socialists.
Friedrich Engels argued that in 1848, at the time when The Communist Manifesto was published, that "socialism was respectable on the continent, while communism was not".
The Owenites in England and the Fourierists in France were considered "respectable" socialists, while working-class movements that "proclaimed the necessity of total social change" denoted themselves communists. This latter branch of socialism produced the communist work of Étienne Cabet in France and Wilhelm Weitling in Germany.
The British moral philosopher John Stuart Mill also came to advocate a form of economic socialism within a liberal context. In later editions of his Principles of Political Economy (1848), Mill would argue that "as far as economic theory was concerned, there is nothing in principle in economic theory that precludes an economic order based on socialist policies".
While democrats looked to the Revolutions of 1848 as a democratic revolution, which in the long run ensured liberty, equality and fraternity, Marxists denounced 1848 as a betrayal of working-class ideals by a bourgeoisie indifferent to the legitimate demands of the proletariat.
Click on any of the following blue hyperlinks for more about Socialism:
- History
- Contemporary socialist politics
- Social and political theory
- Economics
- Politics
- Criticism
- See also:
- List of anti-capitalist and communist parties with national parliamentary representation
- List of communist ideologies
- List of socialist countries
- List of socialist economists
- Socialism by country
- Socialism at Curlie.
- "Socialism". Internet Encyclopedia of Philosophy.
- Cuban Socialism from the Dean Peter Krogh Foreign Affairs Digital Archives.
- Cole, G.D.H. (1922). "Socialism" . Encyclopædia Britannica (12th ed.).
- Ely, Richard T.; Adams, Thomas Sewall (1905). "Socialism" . New International Encyclopedia.
White Americans are Americans who are considered or reported as White. The United States Census Bureau defines White people as those "having origins in any of the original peoples of Europe, the Middle East, or North Africa."
Like all official U.S. racial categories, "White" has a "not Hispanic or Latino" and a "Hispanic or Latino" component, the latter consisting mostly of White Mexican Americans and White Cuban Americans. The term "Caucasian" is often used interchangeably with "White", although the terms are not synonymous.
The largest ancestries of American Whites are:
- German Americans (16.5%),
- Irish Americans (11.9%),
- English Americans (9.2%),
- Welsh Americans (6%),
- Scots-Irish Americans (12%),
- Italian Americans (5.5%),
- Mexican Americans (5.4%),
- French Americans (4%),
- Polish Americans (3%),
- Scottish Americans (1.9%),
- Dutch Americans (1.6%),
- Finnish Americans (2%),
- Danish Americans (0.5%),
- Norwegian Americans (1.5%),
- and Swedish Americans (1.4%).
However, the English American and British American figures are considered a serious undercount, as many people of this stock tend to self-report and identify simply as "American" (6.9%), owing to the length of time their families have inhabited America.
Whites (including Hispanics who identify as White) constitute the majority, with a total of about 246,660,710, or 77.35% of the population as of 2014. Non-Hispanic Whites totaled about 197,870,516, or 62.06% of the U.S. population.
Click on any of the following blue hyperlinks for more about White Americans in the USA:
- Historical and present definitions
- Demographic information
- Population by state or territory
- Culture
- Admixture
- See also:
- Anglo
- Emigration from Europe
- European Americans
- Europhobia
- Hyphenated American
- Middle Eastern Americans
- Non-Hispanic Whites
- Race and ethnicity in the United States
- Stereotypes of White Americans
- White Anglo-Saxon Protestant
- White ethnic
- White Hispanic and Latino Americans
- White Southerners
- White Population 2000 from the US Census
Hispanic and Latino Americans, including a List by Profession
- YouTube Video: How Latino Americans Shaped the U.S., Fought for Acceptance
- YouTube Video: Famous Hispanic Americans
- YouTube Video: I’m Mexican. Does that change your assumptions about me? | Vanessa Vancour | TEDxUniversityofNevada
Click here for a List of Hispanic and Latino Americans by Profession
Hispanic Americans and Latino Americans are Americans who are descendants of people from the Spanish-speaking countries of Latin America and Spain. More generally, the term includes all persons in the United States who self-identify as Hispanic or Latino, whether of full or partial ancestry.
For the 2010 United States Census, people counted as "Hispanic" or "Latino" were those who identified as one of the specific Hispanic or Latino categories listed on the census questionnaire ("Mexican," "Puerto Rican," or "Cuban") as well as those who indicated that they were "other Spanish, Hispanic, or Latino."
The national origins classified as Hispanic or Latino by the United States Census Bureau are the following:
- Spanish,
- Argentine,
- Cuban,
- Colombian,
- Puerto Rican,
- Mexican,
- Dominican,
- Costa Rican,
- Guatemalan,
- Honduran,
- Nicaraguan,
- Panamanian,
- Salvadoran,
- Bolivian,
- Chilean,
- Ecuadorian,
- Paraguayan,
- Peruvian,
- Uruguayan,
- and Venezuelan.
Other U.S. government agencies have slightly different definitions of the term, including Brazilians and other Portuguese-speaking groups. The Census Bureau uses the terms Hispanic and Latino interchangeably.
"Origin" can be viewed as the ancestry, nationality group, lineage, or country of birth of the person or the person's parents or ancestors before their arrival in the United States. People who identify as Spanish, Hispanic, or Latino may be of any race. As the only specifically designated category of ethnicity in the United States (other than non-Hispanic/Latino), Hispanics form a pan-ethnicity incorporating a diversity of inter-related cultural and linguistic heritages.
Most Hispanic Americans are of Mexican, Puerto Rican, Cuban, Salvadoran, Dominican, Guatemalan, or Colombian origin. The predominant origin of regional Hispanic populations varies widely in different locations across the country.
Hispanic Americans are the second fastest-growing ethnic group in the United States after Asian Americans. Hispanic/Latinos overall are the second-largest ethnic group in the United States, after non-Hispanic Whites (a group which, like Hispanics and Latinos, is composed of dozens of sub-groups of differing national origin).
Hispanics have lived within what is now the United States continuously since the founding of St. Augustine by the Spanish in 1565. After Native Americans, Hispanics are the oldest ethnic group to inhabit much of what is today the United States. Many have Native American ancestry. Spain colonized large areas of what is today the American Southwest and West Coast, as well as Florida. Its holdings included present-day California, New Mexico, Nevada, Arizona, and Texas, all of which were part of the Republic of Mexico from its independence in 1821 until the end of the Mexican–American War in 1848.
A study published in 2015 in the American Journal of Human Genetics, based on 23andMe data from 8,663 self-described Latinos, estimated that Latinos in the United States carried a mean of 65.1% European ancestry, 18.0% Native American ancestry, and 6.2% African ancestry. The study found that self-described Latinos from the Southwest, especially those along the Mexican border, had the highest mean levels of Native American ancestry, while self-described Latinos from the South, Midwest, and Atlantic Coast had the highest mean levels of African ancestry.
Click on any of the following blue hyperlinks for more about Hispanic and Latino Americans:
- Terminology
- History
- Demographics
- Youth
- Notable contributions
- Education
- Cultural influence
- Politics
- Cultural issues
- Hispanophobia
- Relations with other minority groups
- See also:
- Places of settlement in United States:
- List of U.S. communities with Hispanic majority populations in the 2010 census
- List of U.S. cities with large Hispanic population
- List of U.S. cities by Spanish-speaking population
- Hispanic and Latino Americans in California
- Hispanic and Latino Americans in Arizona
- Hispanic and Latino Americans in New Mexico
- Hispanic and Latino Americans in Texas
- Hispanic and Latino Americans in Nevada
- Hispanic and Latino Americans in Florida
- Diaspora:
- Individuals:
- Other Hispanic and Latino Americans topics:
- General:
- 2000 Census
- Hispanic Americans in Congress -- Library of Congress
- Hispanic Americans in the U.S. Army
- Latino-Americans Become Unofficial Face of Politics Abroad by Josh Miller, PBS, April 27, 2007
- Latino in America - CNN
- Mexican American News - Xcano Media
- Places of settlement in United States:
African Americans including a List of the 100 Greatest African Americans along with Afro-American History
- YouTube Video of "Still I Rise" by Maya Angelou
- YouTube Video: Roots (2016) Mini-Series Full Trailer
- YouTube Video: The Color Purple (1985) Official Trailer - Oprah Winfrey, Steven Spielberg Movie HD
100 Greatest African Americans is a biographical dictionary of one hundred historically great Black Americans (in no particular order; that is, they are not ranked), as assessed by Temple University professor Molefi Kete Asante in 2002.
Asante used five factors in establishing the list:
- "significance in the general progress of African-Americans toward full equality in the American social and political system"
- "self-sacrifice and a willingness to take great risks for the collective good"
- "unusual will and determination in the face of great danger and against the most stubborn odds"
- "a consistent posture toward raising the social, cultural and economic status of African Americans"
- "personal achievement that reveals the best qualities of the African American people"
African Americans (also referred to as Black Americans or Afro-Americans) are an ethnic group of Americans with total or partial ancestry from any of the Black racial groups of Africa. The term may also be used to include only those individuals who are descended from enslaved Africans. As a compound adjective the term is usually hyphenated as African-American.
Black and African Americans constitute the third largest racial and ethnic group in the United States (after White Americans and Hispanic and Latino Americans). Most African Americans are of West and Central African descent and are descendants of enslaved peoples within the boundaries of the present United States.
The vast majority of African Americans also have some European and Native American ancestry. According to US Census Bureau data, African immigrants generally do not self-identify as African American. The overwhelming majority of African immigrants identify instead with their own respective ethnicities (~95%). Immigrants from some Caribbean, Central American and South American nations and their descendants may or may not also self-identify with the term.
African-American history starts in the 16th century, with peoples from West Africa forcibly taken as slaves to Spanish America, and in the 17th century with West African slaves taken to English colonies in North America. After the founding of the United States, black people continued to be enslaved, with four million denied freedom from bondage prior to the Civil War.
Believed to be inferior to white people, they were treated as second-class citizens. The Naturalization Act of 1790 limited U.S. citizenship to whites only, and only white men of property could vote.
These circumstances were changed by Reconstruction, development of the black community, participation in the great military conflicts of the United States, the elimination of racial segregation, and the Civil Rights Movement which sought political and social freedom. In 2008, Barack Obama became the first African American to be elected President of the United States.
Click on any of the following blue hyperlinks for more about African Americans:
- History
- Demographics
- Religion
- Business
- Language
- Genetics
- Traditional names
- Contemporary issues
- Politics and social issues
- News media and coverage
- Culture in the United States
- Terminology
- Notable people
- See also:
- African American art
- African-American business history
- African-American Civil Rights Movement (1865–95)
- African-American Civil Rights Movement (1896–1954)
- Timeline of the African-American Civil Rights Movement (1954–68)
- African-American literature
- African-American middle class
- African-American music
- African-American names
- African American National Biography Project
- African-American neighborhood
- African-American upper class
- African American Vernacular English
- Afrophobia
- Anglo-African term
- Back-to-Africa movement
- Black feminism
- Black History Month
- Black Lives Matter
- Black Loyalist
- Military history of African Americans
- National Museum of African American History and Culture
- Scientific racism
- Stereotypes of African Americans
- Diaspora:
- Lists:
- Index of articles related to African Americans
- List of historically black colleges and universities
- List of topics related to the African diaspora
- List of populated places in the United States with African-American plurality populations
- List of U.S. states by African-American population
- List of U.S. counties with African-American majority populations in 2000
- List of U.S. metropolitan areas with large African-American populations
- List of U.S. cities with large African-American populations
- List of U.S. communities with African-American majority populations in 2010
- List of African-American neighborhoods
- List of black college football classics
- Terminology:
- Richard Thompson Ford Name Games, Slate, September 16, 2004. Article discussing the problems of defining African American
- "Of Arms & the Law: Don Kates on Afro-American Homicide Rates"
- "The Definition of Political Absurdity", San Francisco Chronicle, March 2, 2007
- African American archaeology in Sacramento, California pdf
- African American archaeology in Oakland, California —See Part III, Chap 10
- Black History related original documents and photos
- Frank Newport, "Black or African American?", Gallup, September 28, 2007
- "The Long Journey of Black Americans" – slideshow by The First Post
African-American history:
African-American history is the branch of American history that concerns African Americans, also known as Black Americans.
Of the estimated 10.7 million Africans who survived the Atlantic crossing to the Americas, about 450,000 were shipped to what is now the United States.
Click on any of the following blue hyperlinks for more about African-American History:
- Enslavement
- Early African-American history
- The Revolution and early America
- Religion
- The antebellum period
- The American Civil War, Emancipation
- Reconstruction
- Jim Crow, disenfranchisement and challenges
- Civil rights
- The Great Migration and the Harlem Renaissance
- African-American businesses
- World War I
- New Deal
- World War II
- Second Great Migration
- Civil Rights Movement
- Post Civil Rights era of African-American history
- Historiography
- See also:
- African-American Heritage Sites
- African-American history of agriculture in the United States
- African American Historic Places
- Black genocide – the notion that African Americans have been subjected to genocide
- List of monuments to African Americans
- Lynching in the United States
- Mass racial violence in the United States
- Racial segregation in the United States
- Racism in the United States
- Slavery in the United States
- Timeline of African-American history
- List of museums focused on African Americans
- History of Africa
- African diaspora
- Civil Rights Movement:
- Civil rights movement (1896–1954)
- Civil rights movement in popular culture
- Timeline of the civil rights movement
- 19th Century African-American civil rights activists
- Black school
- Nadir of American race relations
- By state:
- African Americans in Alabama
- African Americans in Florida
- African Americans in Georgia
- African Americans in Kansas
- African Americans in Louisiana
- African Americans in Maryland
- African Americans in Mississippi
- African Americans in North Carolina
- African Americans in South Carolina
- African Americans in Tennessee
- African Americans in Texas
- African Americans in Utah
- In other regions:
- African Americans in Atlanta
- African Americans in New York City
- African Americans in Omaha, Nebraska
- Civil rights movement in Omaha, Nebraska
- Black Belt (region of Chicago)
- Black Belt (region of Alabama)
- Black history in Puerto Rico
- History of African Americans in Boston
- History of African Americans in Chicago
- History of African Americans in Dallas-Ft. Worth
- History of African Americans in Detroit
- History of African Americans in Houston
- History of African Americans in Philadelphia
- History of African Americans in San Antonio
- A daily look into the great events and people in African American history
- Pioneering African American oral history video excerpts at The National Visionary Leadership Project
- Black History Daily - 365 days of Black History
- African-American history connection
- "Africans in America" - PBS 4-Part Series (2007)
- PBS Red Hand flag Episode 2008
- Living Black History: How Reimagining the African-American Past Can Remake America's Racial Future by Dr. Manning Marable (2006)
- Library of Congress - African American History and Culture
- Library of Congress - African American Odyssey
- Center for Contemporary Black History at Columbia University
- Encyclopædia Britannica - Guide to Black History
- Missouri State Archives - African-American History Initiative
- Black History Month
- "Remembering Jim Crow" - Minnesota Public Radio (multi-media)
- Educational Toys focused on African-American History, History in Action Toys
- "Slavery and the Making of America" - PBS - WNET, New York (4-part series)
- Timeline of Slavery in America
- Tennessee Technological University - African-American History and Studies
- "They Closed Our Schools", the story of Massive Resistance and the closing of the Prince Edward County, Virginia public schools
- Black People in History
- Comparative status of African-Americans in Canada in the 1800s
- Historical resources related to African American history provided free for public use by the State Archives of Florida
- USF Africana Project A guide to African-American genealogy
- Research African-American Records at the National Archives
- Memphis Civil Rights Digital Archive
- Black History Milestones
- African-American Collection, McLean County Museum of History
Asian Americans in the United States
Pictured below: two charts indicating the population makeup of Asian Americans in the United States
Asian Americans are Americans of Asian descent. The term refers to a panethnic group that includes diverse populations who have ancestral origins in East Asia, Southeast Asia, or South Asia, as defined by the U.S. Census Bureau. This includes people who indicate their race(s) on the census as "Asian" or reported entries such as "Asian Indian, Chinese, Filipino, Korean, Japanese, Vietnamese, and Other Asian." Asian Americans with no other ancestry comprise 4.8% of the U.S. population, while people who are Asian alone or combined with at least one other race make up 5.6%.
Although migrants from Asia have been present in parts of the contemporary United States since the 17th century, large-scale immigration did not begin until the mid-19th century. Nativist immigration laws of the 1880s–1920s excluded various Asian groups, eventually prohibiting almost all Asian immigration to the continental United States.
After immigration laws were reformed during the 1940s-60s, abolishing national origins quotas, Asian immigration increased rapidly. Analyses of the 2010 census have shown that Asian Americans are the fastest growing racial or ethnic minority in the United States.
Starting in the early 2000s, Asian American earnings began exceeding those of all other racial groups for both men and women. For example, in 2008 Asian Americans had the highest median household income of any racial demographic.
In 2012, Asian Americans had the highest educational attainment level and median household income of any racial demographic in the country. In 2015, Asian American men were the highest earning racial group as they earned 117% as much as white American men and Asian-American women earned 106% as much as white American women.
Despite this, a 2014 Census Bureau report found that 12% of Asian Americans were living below the poverty line, compared with 10.1% of non-Hispanic white Americans. Once country of birth and other demographic factors are taken into account, Asian Americans are no more likely than non-Hispanic whites to live in poverty.
Click on any of the following blue hyperlinks for more about Asian Americans:
- Terminology
- Demographics
- History
- Notable people
- Cultural influence
- Social and political issues
- Progress as a group within American society
- See also:
- Amerasian
- Asian Pacific American Heritage Month
- Asian American and Pacific Islander Policy Research Consortium
- Asian American studies
- Asian Americans in New York City
- Asian Hispanic and Latino Americans
- Asian Latino
- Asian Pacific American
- Asian pride
- Hyphenated American
- Jade Ribbon Campaign
- Index of Asian American-related articles
- UCLA Asian American Studies Center
- Asian-Nation Asian American History, Culture, Statistics, & Issues
- Korean Americans in America – National organizations, business directory, job posts and news
- U.S. Asian Population, Census 2000, infoplease.com.
- Video: Panel Discussion on 'Asian Americans Changing the Landscape' Asia Society, New York, May 19, 2010
Native Americans in the United States
Pictured: Location of Native American Tribes in the Contiguous United States (all states except Alaska and Hawaii)
- YouTube Video: America's Great Indian Nations
- YouTube Video: 25 Little Known Facts About Native Americans
- YouTube Video: Native American museum opens at Smithsonian
In the United States, Native Americans (also known as American Indians, Indigenous Americans, or simply Indians) are people who belong to one of the more than 500 distinct Native American tribes that survive intact today as partially sovereign nations within the country's modern boundaries. These tribes and bands are descended from the pre-Columbian indigenous population of North America.
Click on any of the following blue hyperlinks for more about Native Americans:
- Background
- History, including:
- Demographics
- Current legal status
- Contemporary issues
- Society, language, and culture
- About
- See also:
- Federally recognized tribes
- (Federally) unrecognized tribes
- List of Alaska Native tribal entities
- List of Indian reservations in the United States
- List of historical Indian reservations in the United States
- List of notable Native Americans of the United States
- List of notable writers from peoples indigenous to the Americas
- National Park Service Native American Heritage Sites
- Native American mythology
- Native Americans in popular culture
- Outline of United States federal Indian law and policy
- State recognized tribes in the United States
- Indigenous peoples of the Americas
- Sexual victimization of Native American women
- Native Americans in the United States at DMOZ
- Government:
- Organizations and media:
- National Congress of American Indians official website - National Congress of American Indians
- Indian Country Today Media Network official website - Indian Country Today Media Network
- First Nations Experience (FNX) - multi-media platform that is a partnership between the San Manuel Band of Mission Indians and KVCR, a PBS member station located in California's Inland Empire
- Academic collections and other resources:
- American Indian Records in the National Archives from the National Archives and Records Administration
- National Museum of the American Indian official website - National Museum of the American Indian, part of the Smithsonian Institution
- National Indian Law Library of the Native American Rights Fund - a law library of federal Indian and tribal law
- 1904–1924 'The North American Indian' - collection of Edward Sheriff Curtis photographs
- Southeastern Native American Documents, 1730–1842 - online collection from several archives, museums, and libraries
- Bonneville Collection of 19th-century photographs of Native Americans, University of South Carolina Library's Digital Collections Page
- Selected treaties from the Avalon Project of Yale Law School's Lillian Goldman Law Library
Affirmative Action in the United States
- YouTube Video: This is how affirmative action began
- YouTube Video: What we get wrong about affirmative action
- YouTube Video: Is Affirmative Action Discriminatory? by The View (ABC)
Affirmative action in the United States is a set of laws, policies, guidelines, and administrative practices "intended to end and correct the effects of a specific form of discrimination." These include government-mandated, government-sanctioned, and voluntary private programs that tend to focus on access to education and employment, specifically granting special consideration to historically excluded groups such as racial minorities or women.
The impetus toward affirmative action is redressing the disadvantages associated with past and present discrimination. Further impetus is a desire to ensure public institutions, such as universities, hospitals, and police forces, are more representative of the populations they serve.
In the United States, affirmative action tends to emphasize not specific quotas but rather "targeted goals" to address past discrimination in a particular institution or in broader society through "good-faith efforts ... to identify, select, and train potentially qualified minorities and women." For example, many higher education institutions have voluntarily adopted policies which seek to increase recruitment of racial minorities. Another example is executive orders requiring some government contractors and subcontractors to adopt equal opportunity employment measures, such as outreach campaigns, targeted recruitment, employee and management development, and employee support programs.
Affirmative action policies were developed to address long histories of discrimination faced by minorities and women, which reports suggest produced corresponding unfair advantages for whites and males.
They first emerged from debates over non-discrimination policies in the 1940s and during the Civil Rights Movement. These debates led to federal executive orders requiring non-discrimination in the employment policies of some government agencies and contractors from the 1940s onward, and to Title VII of the Civil Rights Act of 1964, which prohibited racial discrimination in firms with more than 25 employees.
The first federal policy of race-conscious affirmative action was the Revised Philadelphia Plan, which required government contractors to set "goals and timetables" for integrating and diversifying their workforce.
Similar policies began to emerge through a mix of voluntary practices and federal and state policies in employment and education. Affirmative action as a practice was partially upheld by the Supreme Court in Grutter v. Bollinger (2003), while the use of racial or gender quotas for college admissions was concurrently ruled unconstitutional by the Court in Gratz v. Bollinger (2003).
Affirmative action is a subject of controversy in American politics. Opponents argue that these policies are outdated and lead to reverse discrimination, which entails favoring one group over another based upon racial preference rather than achievement. Many believe that the diversity of current American society suggests that affirmative action policies succeeded and are no longer required.
Some policies adopted as affirmative action, such as racial quotas or gender quotas, have been criticized as a form of reverse discrimination. Scholars have also questioned whether quota systems and "targeted goals" can be clearly distinguished from each other.
Click on any of the following blue hyperlinks for more about Affirmative Action in the United States:
- History
- Legal history
- Arguments in favor of affirmative action
- Arguments against affirmative action
- Implementation in universities
- See also
Poverty in the United States including Low Income Communities along with Growing up in the Projects
TOP: the Pruitt-Igoe Public Housing in St. Louis;
BOTTOM: Comparison of Poverty Rates by Age, 1959 and 2015
- YouTube Video: Mapping poverty in America | The Economist
- YouTube Video: A Portrait of Poverty in America: Job Insecurity and Payday Lending
- YouTube Video: Income inequality: Hunger down the block from wealth
Click here for a List of lowest-income places in the United States
Poverty is a state of deprivation, lacking the usual or socially acceptable amount of money or material possessions. The most common measure of poverty in the U.S. is the "poverty threshold" set by the U.S. government. This measure recognizes poverty as a lack of those goods and services commonly taken for granted by members of mainstream society. The official threshold is adjusted for inflation using the consumer price index.
Most Americans will spend at least one year below the poverty line at some point between ages 25 and 75. Poverty rates are persistently higher in rural and inner city parts of the country as compared to suburban areas.
In 2015, 13.5% of Americans (43.1 million people) lived in poverty. Since the 1930s, U.S. relative poverty rates have consistently exceeded those of other wealthy nations. The lowest poverty rates are found in New Hampshire, Vermont, Minnesota, and Nebraska, where between 8.7% and 9.1% of the population lives in poverty.
In 2009 the number of people who were in poverty was approaching 1960s levels that led to the national War on Poverty. In 2011 extreme poverty in the United States, meaning households living on less than $2 per day before government benefits, was double 1996 levels at 1.5 million households, including 2.8 million children.
In 2012, 14% of seniors and 18% of children lived in poverty. Social Security benefits did more to reduce poverty than any other single factor.
Recent census data shows that half the population qualifies as poor or low income, with one in five Millennials living in poverty. Academic contributors to The Routledge Handbook of Poverty in the United States postulate that new and extreme forms of poverty have emerged in the U.S. as a result of neoliberal structural adjustment policies and globalization, which have rendered economically marginalized communities as destitute "surplus populations" in need of control and punishment.
In 2011, child poverty reached record high levels, with 16.7 million children living in food insecure households, about 35% more than 2007 levels. A 2013 UNICEF report ranked the U.S. as having the second highest relative child poverty rates in the developed world.
According to a 2016 study by the Urban Institute, teenagers in low income communities are often forced to join gangs, save school lunches, sell drugs or exchange sexual favors because they cannot afford food.
There were about 643,000 sheltered and unsheltered homeless people nationwide in January 2009. Almost two-thirds stayed in an emergency shelter or transitional housing program, and the other third were living on the street, in an abandoned building, or in another place not meant for human habitation.
About 1.56 million people, or about 0.5% of the U.S. population, used an emergency shelter or a transitional housing program between October 1, 2008 and September 30, 2009. Around 44% of homeless people are employed.
In June 2016, the IMF warned the United States that its high poverty rate needs to be tackled urgently by raising the minimum wage and offering paid maternity leave to women to encourage them to enter the labor force.
Click on any of the following blue hyperlinks for more about Poverty in the United States:
- Measures of poverty
- Poverty and demographics
- Poverty and education
- Food security
- Factors in poverty
- Concerns regarding accuracy
- Fighting poverty
- See also:
- Income in the United States
- Income inequality in the United States
- Income deficit
- List of U.S. states by poverty rate
- Lowest-income counties in the United States
- Homelessness in the United States
- Hunger in the United States
- Poor person
- Social programs in the United States
- Pathways out of Poverty (POP)
- U.S. Census Bureau Poverty Definition
- U.S. Census Bureau Poverty in the United States
- Child Poverty and Tax: a simple graph of child disposable income disparity in OECD countries against tax burdens.
- F.H.C. Ministries Charity is not Reform!
- From Poverty to Prosperity: A National Strategy to Cut Poverty in Half, The Center for American Progress, April 2007.
- Explanation of poverty definition by economist Ellen Frank in Dollars & Sense magazine, January/February 2006
- "Deciding Who's Poor" by economist Barbara Bergmann in Dollars & Sense magazine, March/April 2000
- 37 million poor hidden in the land of plenty
- David Walls, Models of Poverty and Planned Change
- U.S. Government Does Relatively Little to Lessen Child Poverty Rates
- U.S. Department of Health & Human Services Poverty Guidelines, Research, and Measurement
- Cities Tolerate Homeless Camps by Jennifer Levitz, The Wall Street Journal, August 11, 2009
- The Forgotten Americans PBS series by Hector Galan about colonias.
- Americans living in Third World conditions This article discusses the living conditions of people inhabiting colonias (with pictures).
- Steve Suitts, "The Worst of Times: Children in Extreme Poverty in the South and Nation," Southern Spaces, June 29, 2010.
- 80 Percent Of U.S. Adults Face Near-Poverty, Unemployment: Survey--Huffington Post, July 28, 2013
- The American Way of Poverty: As Inequality Hits Record High, Sasha Abramsky on the Forgotten Poor. DemocracyNow! September 12, 2013.
- America's Shameful Poverty Stats, Sasha Abramsky. The Nation, September 18, 2013.
- How Much Money to End Poverty in America? Truthdig. September 26, 2013.
- Poverty in the United States: 2012 Congressional Research Service
- It Is Expensive to Be Poor. The Atlantic. January 13, 2014.
- Here's The Painful Truth About What It Means To Be 'Working Poor' In America. The Huffington Post, May 19, 2014.
- 10 Poverty Myths, Busted. Mother Jones, March/April 2014 issue.
- FPL Calculator, A mobile app for calculating federal poverty level.
- The Poor Get Prison. Institute for Policy Studies, 2015.
- Poverty research on IssueLab.
- Other:
- Human Poverty Index
- Mississippi Teacher Corps
- Basic Income
- Negative Income Tax
- Tipping Point Community
- Redistributive change
- De-industrialization crisis
- The Other America
- Two Americas
- Kids Against Hunger
- Can you hear their voices? (1931 play)
- Feminization of poverty
- Unintended pregnancy
- Social determinants of health in poverty
- International:
Public housing in the United States is administered by federal, state and local agencies to provide subsidized assistance for low-income households. Public housing is priced well below the market rate, allowing people to live in more convenient locations rather than move away from the city in search of lower rents.
Public housing in the U.S. originally consisted primarily of one or more concentrated blocks of low-rise and/or high-rise apartment buildings, though it is now provided in a variety of settings and formats. These complexes are operated by state and local housing authorities, which are authorized and funded by the United States Department of Housing and Urban Development. More than 1.2 million households currently live in public housing of some type.
Subsidized apartment buildings, often referred to as housing projects or colloquially "the projects", have a complicated and often notorious history in the United States. While the first decades of projects were built with higher construction standards and a broader range of incomes and applicants, over time, public housing increasingly became the housing of last resort in many cities.
Several reasons have been cited for this negative trend including the failure of Congress to provide sufficient funding, a lowering of standards for occupancy, and mismanagement at the local level. Furthermore, housing projects have also been seen to greatly increase concentrated poverty in a community, leading to several negative externalities.
Crime, drug usage, and educational underperformance are all widely associated with housing projects, particularly in urban areas.
As a result of their various problems and diminished political support, many of the traditional low-income public housing properties constructed in the earlier years of the program have been demolished.
Beginning primarily in the 1970s the federal government turned to other approaches including the Section 8 project-based program, Section 8 certificates, and the Housing Choice Voucher Program.
In the 1990s the federal government accelerated the transformation of traditional public housing through HUD's HOPE VI program. HOPE VI funds are used to tear down distressed public housing projects and replace them with mixed-income communities constructed in cooperation with private partners.
In 2012, Congress and HUD initiated a new program called the Rental Assistance Demonstration (RAD) program. Under the demonstration program, eligible public housing properties are redeveloped in conjunction with private developers and investors.
Click on any of the following blue hyperlinks for more about Public Housing in the United States:
- History
- Social issues
- Alternative models
- City programs
- See also:
- List of public housing developments in the United States
- People:
- Harold Harby (1894–1978), Los Angeles, California, City Council member whose vote switch killed public housing in that city
- General:
Immigration Law in the United States, including a List of Immigration Laws
- YouTube Video: The Statue of Liberty: Building an Icon
- YouTube Video: "Tour of a lifetime" of Statue of Liberty
- YouTube Video: Becoming a U.S. Citizen: What You Need to Know (by USA.Gov)
Click here for a List of Immigration Laws in the United States
Immigration law refers to national government policies controlling the immigration and deportation of people, and other matters such as citizenship.
Immigration law regarding the citizens of a country is regulated by international law. The United Nations International Covenant on Civil and Political Rights mandates that all countries allow entry to their own citizens.
Certain countries may maintain rather strict laws which regulate both the right of entry and internal rights, such as the duration of stay and the right to participate in government. Most countries have laws which designate a process for naturalization, by which foreigners may become citizens.
Immigration Law in the United States:
During the colonial period, independent colonies created their own immigration laws. The first law governing the naturalization of foreigners was the Naturalization Act of 1790. The 1882 Chinese Exclusion Act was passed to stop the immigration of Chinese people.
The Emergency Quota Act of 1921 and the Immigration Act of 1924 put a quota on how many immigrants were permitted, based on nationality and the numbers of persons who had immigrated in previous years. The Immigration and Nationality Act of 1952 led to the creation of the Immigration and Naturalization Service.
The Department of Homeland Security, which replaced the Immigration and Naturalization Service, now enforces immigration laws. The United States grants Legal Permanent Resident status to more than 1 million immigrants every year and issues more visas than any other country in the world.
Visas in the United States fall into two broad categories: immigrant visas and non-immigrant visas. The former are subject to per-country caps, whereas the latter are not. Most non-immigrant visas are issued for work purposes and usually require an offer of employment from a U.S. employer. Such immigration may involve restrictions such as labor certification, which ensures that no American workers are available to fill the role.
Other categories include student, family and tourist visas. Each visa category is further divided into numerous subcategories; the large number of specific categories has been recommended as a main area for comprehensive immigration reform.
George W. Bush advocated for immigration amnesty in 2007 and sent a bill to the Senate, where it was defeated.
Obama took a different approach: rather than sending an amnesty bill to the Senate, he worked to override and roll back many immigration enforcement policies already in place. While there were fewer removals and returns under the Obama administration than under each of the two prior administrations, those declines must be understood against the backdrop of a significant reduction in border apprehensions that resulted from a sharp decrease in unauthorized inflows, in particular of Mexicans. (Click here for more about Obama's policies).
Click on any of the following blue hyperlinks for more about Immigration Law, including in the United States:
- Control measures
- Comparison of immigration visa categories by country or territory
- Comparison table of different countries' immigration law
- See also
Immigration to the United States including Illegal Immigration and its Illegal Immigration Population
- YouTube Video: More Than 100 Immigrants Sworn In As New U.S. Citizens
- YouTube Video: Why Migrants at the U.S. Border Are Becoming More Desperate
- YouTube Video: Texas residents fed up with surge of illegal immigrants
Immigration to the United States is the international movement of individuals who are not natives or do not possess citizenship in order to settle, reside, study, or take up employment in the United States. It has been a major source of population growth and cultural change throughout much of the history of the United States. The economic, social, and political aspects of immigration have caused controversy regarding ethnicity, economic benefits, jobs for non-immigrants, settlement patterns, impact on upward social mobility, crime, and voting behavior.
Prior to 1965, policies such as the national origins formula limited immigration and naturalization opportunities for people from areas outside Western Europe.
Exclusion laws enacted as early as the 1880s generally prohibited or severely restricted immigration from Asia, and quota laws enacted in the 1920s curtailed Eastern European immigration.
The Civil Rights Movement led to the replacement of these ethnic quotas with per-country limits. Since then, the number of first-generation immigrants living in the United States has quadrupled.
As for economic effects, research suggests that immigration to the United States is beneficial to the US economy. Research, with few exceptions, finds that immigration on average has positive economic effects on the native population, but findings are mixed as to whether low-skilled immigration adversely affects low-skilled natives. Research finds that immigration either has no impact on the crime rate or that it reduces the crime rate in the United States. Research shows that the United States excels at assimilating first- and second-generation immigrants relative to many other Western countries.
Click on any of the following blue hyperlinks for more about Immigration to the United States:
Illegal immigration is the entry of a person or a group of persons across a country's border, in a way that violates the immigration laws of the destination country, with the intention to remain in the country.
Illegal immigration, as well as immigration in general, is overwhelmingly upward, from a poorer to a richer country. However, it is also noted that illegal immigrants tend not to be the poorest within the populations they emigrate from.
Reasons for taking the risk of living illegally in another country are not only the expected improvements in income and living conditions, but also the hope of eventually being allowed to remain in the country legally, as there may be a path to becoming naturalized. Living in another country illegally carries a variety of restrictions as well as the risk of being detained and deported or of facing other sanctions. If their illegal status is in any way perceivable to host-community residents, illegal migrants may additionally face visible or verbal disdain.
Click here for Illegal immigrant population of the United States.
Click on any of the following blue hyperlinks for more about Illegal Immigration in the United States:
Approximately 11 million illegal immigrants are estimated to be living in the United States. Estimates from the Pew Hispanic Center show the number of illegal immigrants has declined to 11.1 million in March 2009 from a peak of 12 million in March 2007. The majority of the illegal immigrants are from Mexico. The issue of illegal immigration has long been controversial in the United States.
In 2007, President George W. Bush called for Congress to endorse his guest worker proposal, stating that illegal immigrants took jobs that Americans would not take.
The Pew Hispanic Center notes that while the number of legal immigrants arriving has not varied substantially since the 1980s, the number of illegal immigrants has increased dramatically and, since the mid-1990s, has surpassed the number of legal immigrants. Penalties for employers of illegal immigrants, of $2,000–$10,000 and up to six months' imprisonment, go largely unenforced.
Political groups like Americans for Legal Immigration have been formed to demand enforcement of immigration laws and secure borders. ALIPAC has also called for "safe departure" border checkpoints, free of criminal checks.
In a 2011 news story, the Los Angeles Times reported:
"... illegal immigrants in 2010 were parents of 5.5 million children, 4.5 million of whom were born in the U.S. and are citizens. Because illegal immigrants are younger and more likely to be married, they represented a disproportionate share of births—8% of the babies born in the U.S. between March 2009 and March 2010 were to at least one illegal immigrant parent.
Immigration from Mexico to the United States has slowed in recent years. This has been attributed to the slowing of the U.S. economy, the buildup in security along the border and increased violence on the Mexican side of the border.
Prior to 1965, policies such as the national origins formula limited immigration and naturalization opportunities for people from areas outside Western Europe.
Exclusion laws enacted as early as the 1880s generally prohibited or severely restricted immigration from Asia, and quota laws enacted in the 1920s curtailed Eastern European immigration.
The Civil Rights Movement led to the replacement of these ethnic quotas with per-country limits. Since then, the number of first-generation immigrants living in the United States has quadrupled.
As for economic effects, research suggests that immigration to the United States is beneficial to the US economy. Research, with few exceptions, finds that immigration on average has positive economic effects on the native population, but is mixed as to whether low-skilled immigration adversely affects low-skilled natives. Research finds that immigration either has no impact on the crime rate or that it reduces the crime rate in the United States. Research shows that the United States excels at assimilating first- and second-generation immigrants relative to many other Western countries.
Click on any of the following blue hyperlinks for more about Immigration to the United States:
- History
- Contemporary immigration
- Ethnicity
- Demography
- Effects of immigration
- Public opinion
- Legal issues
- Immigration in popular culture
- Immigration in literature
- Documentary films
- Legal perspectives
- Interpretive perspectives
- See also:
- Demographics of the United States
- European colonization of the Americas
- History of laws concerning immigration and naturalization in the United States
- How Democracy Works Now: Twelve Stories
- Inequality within immigrant families (United States)
- Nativism (politics), opposition to immigration
- Opposition to immigration
- United States immigration statistics
- Immigrant benefits urban legend, a hoax regarding benefits comparison
- Surveys
- History:
- Immigration policy:
- Current immigration:
- U.S. Citizenship and Immigration Services
- U.S. Immigration and Customs Enforcement
- Cornell University's Legal Information Institute: Immigration
- Yearbook of Immigration Statistics – United States Department of Homeland Security, Office of Immigration Statistics 2004, 2005 editions available.
- "Estimates of the Unauthorized Immigrant Population Residing in the United States: January 2005" M. Hoefer, N. Rytina, C. Campbell (2006) "Population Estimates (August). U.S. Department of Homeland Security, Office of Immigration Statistics.
- Films about immigration:
- How Democracy Works Now: Twelve Stories (2010)
- Well-Founded Fear (1999)
- Economic impact:
- Immigration in American Economic History by Ran Abramitzky and Leah Platt Boustan, NBER Working Paper No. 21882, January 2016
- The New Political Economy of Immigration by Tom Barry in Dollars & Sense magazine, January/February 2009
- Immigrants and the Labor Market from Dollars & Sense magazine, May/June 2006
- Immigrants in Black & White: A Review of "Communities Without Borders", The Indypendent, Susan Chenelle
- Immigration, Numbers, NumbersUSA: For Lower Immigration Levels
- Immigration, World Poverty and Gumballs - Updated 2010 - YouTube (6:07)
Illegal immigration is the entry of a person or a group of persons across a country's border, in a way that violates the immigration laws of the destination country, with the intention to remain in the country.
Illegal immigration, as well as immigration in general, is overwhelmingly upward, from a poorer to a richer country. However, it is also noted that illegal immigrants tend not to be the poorest within the populations they emigrate from.
Reasons for taking the risk of living illegally in another country are not only the expected improvements in income and living conditions, but also the hope of eventually being allowed to remain in the country legally, as there may be a path to becoming naturalized. Living in another country illegally involves a variety of restrictions as well as the risk of being detained and deported or of facing other sanctions. If the status of being illegal is in any way perceivable to host community residents, illegal migrants may additionally face visible or verbal disdain.
Click here for Illegal immigrant population of the United States.
Click on any of the following blue hyperlinks for more about Illegal Immigration in the United States:
- Terminology
- Reasons for illegal immigration
- Problems faced by illegal immigrants
- Methods
- Information on illegal immigrant populations by country or region
An estimated 11 million illegal immigrants live in the United States. Estimates from the Pew Hispanic Center show the number declined to 11.1 million in March 2009 from a peak of 12 million in March 2007. The majority of illegal immigrants are from Mexico. The issue of illegal immigration has long been controversial in the United States.
In 2007, President George W. Bush called for Congress to endorse his guest worker proposal, stating that illegal immigrants took jobs that Americans would not take.
The Pew Hispanic Center notes that while the number of legal immigrants arriving has not varied substantially since the 1980s, the number of illegal immigrants has increased dramatically and, since the mid-1990s, has surpassed the number of legal immigrants. Penalties for employers of illegal immigrants, of $2,000–$10,000 and up to six months' imprisonment, go largely unenforced.
Political groups like Americans for Legal Immigration have been formed to demand enforcement of immigration laws and secure borders. ALIPAC has also called for "safe departure" border checkpoints, free of criminal checks.
In a 2011 news story, the Los Angeles Times reported:
"... illegal immigrants in 2010 were parents of 5.5 million children, 4.5 million of whom were born in the U.S. and are citizens. Because illegal immigrants are younger and more likely to be married, they represented a disproportionate share of births—8% of the babies born in the U.S. between March 2009 and March 2010 were to at least one illegal immigrant parent.
Immigration from Mexico to the United States has slowed in recent years. This has been attributed to the slowing of the U.S. economy, the buildup in security along the border and increased violence on the Mexican side of the border.
Gender Inequality in the United States
- YouTube Video: Why did the U.S. rank so low for gender equality? (CBS News Oct. 27, 2016)
- YouTube Video: What stands in the way of women being equal to men? (BBC News March 26, 2017)
- YouTube Video: Why Gender Equality Is Good for Everyone — Men Included Michael Kimmel | TED Talks
Gender inequality in the United States has been diminishing throughout its history and significant advancements towards equality have been made beginning mostly in the early 1900s.
However, despite this progress, gender inequality in the United States continues to persist in many forms, including the disparity in women's political representation and participation, occupational segregation, the gender pay gap, and the unequal distribution of household labor.
In the past 20 years, issues affecting boys and men have also emerged, most notably an achievement and attainment gap in education. The alleviation of gender inequality has been the goal of several major pieces of legislation from 1920 to the present day.
As of 2017, the World Economic Forum ranked the United States 49th out of 144 countries in gender equality.
Inequality, prejudice, and violence against transgender men and women, as well as gender-nonconforming individuals and individuals who identify with genders outside the gender binary, are also prevalent in the United States.
Transgender individuals suffer from prejudices in the workforce and employment, higher levels of domestic violence, higher rates of hate crimes, especially murder, and higher levels of police brutality when compared to the cisgender population.
Current Issues for Women:
Political participation:
The Center for American Women and Politics reports that, as of 2013, 18.3% of congressional seats are held by women and 23% of statewide elective offices are held by women; while the percentage of Congress made up of women has steadily increased, statewide elective positions held by women have decreased from their peak of 27.6% in 2001.
Women also make up, as of 2013, 24.2% of state legislators in the United States. Among the one hundred largest cities in the United States, ten had female mayors as of 2013.
In 1977, political science professor Susan Welch presented three possible explanations for this under-representation of women in politics:
- One, that women are socialized to avoid careers in politics;
- two, that women's responsibilities in the home keep them out of both the work force and the political arena;
- and three, that women are more often than men members of other demographic groups with low political participation rates.
In 2001, M. Margaret Conway, political science professor at the University of Florida, also presented three possible explanations for the continuation of this disparity: one, similar to Welch's first explanation, sociological and societal norms discourage women from running; two, women less frequently acquire from nonpolitical activities the skills necessary to hold a political leadership position; and three, gate-keeping in party politics prevents women from running.
Work life and economics:
The United States is falling behind other Western countries in the percentage of women engaged in the workforce.
Researchers from the Institute for Women's Policy Research at the University of California Hastings College of Law argue that this growing gap is due to a lack of governmental, business and societal support for working women. They ranked the United States last out of 20 industrialized countries in an index that measured such programs as family leave, alternative work arrangements, part-time employment, and other means to make workplaces more flexible and family-friendly.
The United States is also the only industrialized nation that does not have a paid parental leave policy mandated by law, and is one of only four countries worldwide that does not; in addition, fully paid maternity leave is only offered by around 16 percent of employers in the United States.
Sex discrimination in employment:
According to a study conducted by researchers at California State University, Northridge, when an individual with a PhD applies for a position at a university, that individual is significantly more likely to be offered a higher level of appointment, receive an offer of an academic position leading to tenure, and be offered a full professorship if they are a man when compared to a woman of comparable qualifications.
However, these findings have been disputed, with one study finding that universities, pushed to hire more women, gave female applicants a 2:1 advantage over male applicants in science, technology, engineering, and mathematics fields.
Another study found that women were significantly less likely than equally qualified men to receive a job offer or an interview for a high-paying waiter position; this study also found that such hiring discrimination may be caused in part by customers' preference for male wait staff.
Similarly, research conducted at the University of California, Davis focusing on academic dermatology revealed a significant downward trend in the number of women receiving funding from the National Institutes of Health, which the authors concluded was due to a lack of support for women scientists at their home institutions.
Research from Lawrence University has found that men were more likely to be hired in traditionally masculine jobs, such as sales management, and women were more likely to be hired in traditionally feminine jobs, such as receptionist or secretary. However, individuals of either gender with masculine personality traits were advantaged when applying for either masculine or feminine jobs, indicating a possible valuing of stereotypically male traits above stereotypically female traits.
Occupational segregation by gender:
Main article: Occupational segregation
Occupational gender segregation takes the form of both horizontal segregation (the unequal gender distribution across occupations) and vertical segregation (the overrepresentation of men in higher positions in both traditionally male and traditionally female fields).
According to William A. Darity, Jr. and Patrick L. Mason, there is a strong horizontal occupational division in the United States on the basis of gender; in 1990, the index of occupational dissimilarity was 53%, meaning 53% of women or 47% of men would have to move to a different career field in order for all occupations to have equal gender composition.
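The index of occupational dissimilarity cited above is conventionally computed as the Duncan & Duncan index, D = 1/2 × sum over occupations of |m_i/M − w_i/W|, where m_i and w_i are the numbers of men and women in occupation i and M and W are the gendered totals. A minimal sketch, using made-up occupation counts rather than the 1990 data:

```python
# Duncan & Duncan index of dissimilarity: the share of either gender
# that would have to change occupations for every occupation to have
# the same gender composition. Occupation counts are hypothetical.
occupations = {
    # occupation: (number of men, number of women)
    "nursing": (300, 2_700),
    "engineering": (2_600, 400),
    "teaching": (900, 2_100),
    "sales": (1_200, 1_300),
}

total_men = sum(m for m, w in occupations.values())
total_women = sum(w for m, w in occupations.values())

# D = 1/2 * sum_i | m_i / M - w_i / W |
D = 0.5 * sum(
    abs(m / total_men - w / total_women) for m, w in occupations.values()
)
print(f"Index of dissimilarity: {D:.1%}")  # ~49.8% for these counts
```

A value of 0 would mean every occupation mirrors the overall gender mix; a value of 1 would mean complete segregation.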
While women have begun to enter traditionally male-dominated professions more frequently, far fewer men have entered female-dominated professions; professor of sociology Paula England cites this horizontal segregation of careers as a contributing factor to the gender pay gap.
Pay gap:
Main article: Gender pay gap in the United States
With regard to the gender pay gap in the United States, the International Labour Organization notes that, as of 2010, women in the United States earned about 81% of what their male counterparts did. While the gender pay gap has been narrowing since the passage of the Equal Pay Act, the convergence began to slow in the 1990s.
In addition, overall wage inequality has been increasing since the 1980s as middle-wage jobs decline and are replaced by growing shares of both high-paying and low-paying jobs, creating a highly polarized environment.
However, numerous studies dispute the claim that discrimination accounts for the majority of the pay gap. When adjusting for commonly chosen industries, hours worked, and benefits received, the pay gap narrows to about 5%, which has been attributed to less aggressive pay negotiation by women.
One study found that before age 30, women earned more than men, and hypothesized that choosing a family over a career caused this female wage advantage to disappear during women's thirties.
According to researchers at the University of California, Berkeley and the University of Illinois at Urbana-Champaign, the primary cause of this gap is discrimination manifested in the tendency of women to be hired more frequently in lower paying occupations, in addition to the fact that male dominated occupations are higher paying than female dominated occupations, and that, even within comparable occupations, women are often paid less than men.
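The decomposition described above, an overall earnings ratio arising partly from sorting into lower-paying occupations and partly from within-occupation pay differences, can be illustrated with toy numbers (all wages and headcounts below are hypothetical):

```python
# Toy illustration of how occupational sorting combines with
# within-occupation pay differences to produce an overall earnings
# ratio. All wages and headcounts below are hypothetical.
occupations = {
    # occupation: (men's mean wage, women's mean wage, n men, n women)
    "engineering": (90_000, 86_000, 800, 200),
    "teaching": (52_000, 50_000, 300, 700),
}

men_total = sum(mw * nm for mw, ww, nm, nw in occupations.values())
men_count = sum(nm for mw, ww, nm, nw in occupations.values())
women_total = sum(ww * nw for mw, ww, nm, nw in occupations.values())
women_count = sum(nw for mw, ww, nm, nw in occupations.values())

men_avg = men_total / men_count        # ~79,636
women_avg = women_total / women_count  # 58,000

# Overall, women here earn about 73% of what men earn ...
print(f"Overall ratio: {women_avg / men_avg:.1%}")

# ... even though within each occupation the ratio is about 96%:
for name, (mw, ww, _, _) in occupations.items():
    print(f"{name}: {ww / mw:.1%}")
```

In this sketch the overall gap is far wider than either within-occupation gap, because women are concentrated in the lower-paying occupation; this is the mechanism the researchers describe.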
In medicine, female physicians are compensated less, despite the fact that evidence suggests the quality of care female physicians provide may be higher than that of male physicians.
In addition to the gender pay gap, a "family gap" also exists: women with children receive about 10-15% less pay than women without children. According to Jane Waldfogel, professor of social work and public affairs at Columbia University, this family gap is a contributing factor to the United States' large gender pay gap. She also noted that men did not seem to be affected by this gap, as married men (who are more likely to have children) generally earned more than unmarried men.
Social life:
Researchers from the University of Michigan have found that from 1970 to 1985, the percentage of men and women who supported traditional social roles for wives and believed that maternal employment damages mother-child relationships or children's development decreased.
Similarly, Jane Wilke from the University of Connecticut found that men's support for the idea that men should be the sole source of income in a married couple decreased from 32 to 21 percent between 1972 and 1989; in practice, only 15 percent of households were supported by a male spouse's income alone at the time of the study.
However, more recent research in 2011 found that attitudes towards gender and societal roles have changed very little since the mid-1990s, with attitudes hovering at about sixty to seventy percent egalitarian. This study theorized that an "egalitarian but traditional" gender frame emerged in popular culture during this period, one that supports each gender assuming traditional roles without appearing sexist or discriminatory, and that this frame is responsible for the stall.
Stephanie Coontz, a professor of family history at Evergreen State College, noted that one of the factors contributing to the gender inequality in the United States is that most men still expect women and men to assume traditional gender roles in the households and for women to carry out a larger share of the housework.
This has been confirmed by a number of other studies; for example, Makiko Fuwa from the University of California, Irvine noted that while there has been movement towards greater equality, "in 1995 American women still spent nearly twice as much time on housework than men" and there is also a segregation of household tasks.
This gendered division of household labor creates what is known as the second shift or double burden, where working women in a heterosexual couple with a working partner spend significantly more time on childcare and household chores.
Researchers from the University of Maryland have found that while men have steadily begun to perform more household labor since 1965, most of the essential and traditionally feminine tasks are still carried out by women; men generally carry out more nonessential or infrequent tasks, such as taking out the trash or mowing the lawn.
While both genders tend to have roughly equal amounts of leisure time, men have more uninterrupted leisure time when compared to women. Working mothers also tend to get less sleep when compared to their working husbands.
Education:
Literacy and enrollment in primary and secondary education are at parity in the United States, and women are over-represented in tertiary education. There is, however, notable gender segregation in degree choice, correlated with lower incomes for graduates with "feminine" degrees, such as education or nursing, and higher incomes for those with "masculine" degrees, such as engineering.
In addition, men have a statistically significant advantage over women when applying for highly selective universities, despite the fact that women generally outperform men in high school.
Females started outnumbering males in higher education in 1992.
Other issues:
Research conducted at Lycoming College has found the enjoyment of sexist humor to be strongly correlated with sexual aggression towards women among male college students. In addition, studies have shown that exposure to sexist humor, particularly humor related to sexual assault, can increase male aggression and their tendency to discriminate against women.
One study also asserted that the attitudes behind such humor create an environment where such discriminatory and possibly violent behavior is acceptable. Men's tendency to self-report the likelihood that they would commit sexually violent acts has also been found to increase after exposure to sexist humor, as reported by researchers from the University of Kent.
Benevolent sexism, sometimes referred to as chivalry, which holds women as something to be protected, also has psychological effects. Women who hold these views are more likely to have less ambitious career goals and men who hold these views tend to have a polarized and stereotyped view of women, made up of both very favorable and very unfavorable traits.
In such cases, the stereotyped view of women is "favorable in content and yet prejudicial in [its] consequences," and attempts to provide justification for discriminatory behaviors presented as helpful or paternal.
Click on any of the following blue hyperlinks for more about Gender Inequality in the United States:
However, despite this progress, gender inequality in the United States continues to persist in many forms, including the disparity in women's political representation and participation, occupational segregation, the gender pay gap, and the unequal distribution of household labor.
In the past 20 years there have been emerging issues for boys/men, an achievement and attainment gap in education is a discussed subject. The alleviation of gender inequality has been the goal of several major pieces of legislation since 1920 and continuing to the present day.
As of 2017, the World Economic Forum ranked the United States 49th best in terms of gender equality out of 144 countries.
In addition to the inequality faced by transgender women, inequality, prejudice, and violence against transgender men and women, as well as gender nonconforming individuals and individuals who identify with genders outside the gender binary, are also prevalent in the United States.
Transgender individuals suffer from prejudices in the workforce and employment, higher levels of domestic violence, higher rates of hate crimes, especially murder, and higher levels of police brutality when compared to the cisgender population.
Current Issues for Women:
Political participation:
The Center for American Women and Politics reports that, as of 2013, 18.3% of congressional seats are held by women and 23% of statewide elective offices are held by women; while the percentage of Congress made up of women has steadily increased, statewide elective positions held by women have decreased from their peak of 27.6% in 2001.
Women also make up, as of 2013, 24.2% of state legislators in the United States. Among the one hundred largest cities in the United States, ten had female mayors as of 2013.
In 1977, political science professor Susan Welch presented three possible explanations for this under-representation of women in politics:
- One, that women are socialized to avoid careers in politics;
- two, that women's responsibilities in the home keep them away out of both the work force and the political arena;
- and three, women are more often than men members of other demographic groups with low political participation rates.
In 2001, M. Margaret Conway, political science professor at the University of Florida, also presented three possible explanations for the continuation of this disparity: one, similar to Welch's first explanation, sociological and societal norm discourages women from running; two, women less frequently acquire the necessary skills to hold a political leadership position from nonpolitical activities; and three, gate-keeping in party politics prevents women from running.
Work life and economics:
The United States is falling behind other Western countries in the percentage of women engaged in the workforce.
Researchers from the Institute for Women's Policy Research at the University of California Hastings College of Law argue that this growing gap is due to a lack of governmental, business and societal support for working women. They ranked the United States last out of 20 industrialized countries in an index that measured such programs as family leave, alternative work arrangements, part-time employment, and other means to make workplaces more flexible and family-friendly.
The United States is also the only industrialized nation that does not have a paid parental leave policy mandated by law, and is one of only four countries worldwide that does not; in addition, fully paid maternity leave is only offered by around 16 percent of employers in the United States.
Sex discrimination in employment:
According to a study conducted by researchers at California State University, Northridge, when an individual with a PhD applies for a position at a university, that individual is significantly more likely to be offered a higher level of appointment, receive an offer of an academic position leading to tenure, and be offered a full professorship if they are a man when compared to a woman of comparable qualifications.
However, these findings have been disputed, with one study finding universities pushed to hire more women, resulting in females being given a 2:1 advantage over males in science, technology engineering and mathematics fields.
Another study found that women were significantly less likely to receive a job offer or an interview for a high-paying waiter position when compared to equally qualified men; this study also found that such hiring discrimination may be caused in part by customer's discrimination of preference for male wait staff.
Similarly, research conducted at the University of California, Davis focusing on academic dermatology revealed a significant downward trend in the number of women receiving funding from the National Institutes of Health, which the authors concluded was due to a lack of support for women scientists at their home institutions.
Research from Lawrence University has found that men were more likely to be hired in traditionally masculine jobs, such as sales management, and women were more likely to be hired in traditionally feminine jobs, such as receptionist or secretary. However, individuals of either gender with masculine personality traits were advantaged when applying for either masculine or feminine jobs, indicating a possibly valuing of stereotypically male traits above stereotypically female traits.
Occupational segregation by gender:
Main article: Occupational segregation
Occupational gender segregation takes the form of both horizontal segregation (the unequal gender distribution across occupations) and vertical segregation (the overrepresentation of men in higher positions in both traditionally male and traditionally female fields).
According to William A. Darity, Jr. and Patrick L. Mason, there is a strong horizontal occupational division in the United States on the basis of gender; in 1990, the index of occupational dissimilarity was 53%, meaning 53% of women or 47% of men would have to move to different career field in order for all occupations to have equal gender composition.
While women have begun to more frequently enter traditionally male-dominated professions, there have been much fewer men entering female-dominated professions; professor of sociology Paula England cites this horizontal segregation of careers as a contributing factor to the gender pay gap.
Pay gap:
Main article: Gender pay gap in the United States
With regards to the gender pay gap in the United States, International Labour Organization notes as of 2010 women in the United States earned about 81% of what their male counterparts did. While the gender pay gap has been narrowing since the passage of the Equal Pay Act, the convergence began to slow down in the 1990s.
In addition, overall wage inequality has been increasing since the 1980s as middle-wage jobs are decreasing replaced by larger percentages of both high-paying and low-paying jobs, creating a highly polarized environment.
However numerous studies dispute the claim that discrimination accounts for the majority of the pay gap. When adjusting for industries commonly chosen, hours worked, and benefits received, the pay gap returns to 5%, which has been attributed to less aggressive pay negotiating in women.
One study actually found that before 30, females made more than males, and hypothesized that choosing a family over a career resulted in the drop of the female wage advantage during the thirties.
According to researchers at the University of California, Berkeley and the University of Illinois at Urbana-Champaign, the primary cause of this gap is discrimination manifested in the tendency of women to be hired more frequently in lower paying occupations, in addition to the fact that male dominated occupations are higher paying than female dominated occupations, and that, even within comparable occupations, women are often paid less than men.
In medicine, female physicians are compensated less, despite the fast that evidence suggest that the quality of care female physicians provide may be higher than that of male physicians.
In addition to the gender pay gap, a "family gap" also exists, wherein women with children receive about 10-15% less pay when compared to women without children. According to Jane Waldfogel, professor of social work and public affairs at Columbia University, this family gap is a contributing factor to the United States' large gender pay gap. She also noted that men did not seem to be affected by this gap, as married men (who are more likely to have children) generally earned higher than unmarried men.
Social life:
Researchers from the University of Michigan have found that from 1970 to 1985, the percentage of men and women who supported traditional social roles for wives and believed that maternal employment damages mother-child relationships or children's development decreased.
Similarly, Jane Wilke from the University of Connecticut found that men's support the idea that men should be the sole source of income in a married couple decreased from 32 to 21 percent from 1972 to 1989; in practice only 15 percent of households were supported by a male spouse's income alone at the time of the study.
However, more recent research in 2011 has found that attitudes towards gender and societal roles have changed very little since the mid-1990s, with attitudes hovering at about sixty to seventy percent egalitarian. This study theorized that a "egalitarian but traditional" gender frame emerged in popular culture during this period, which supports each gender assuming their traditional roles without appearing sexist or discriminatory, and is responsible for this backlash.
Stephanie Coontz, a professor of family history at Evergreen State College, noted that one of the factors contributing to the gender inequality in the United States is that most men still expect women and men to assume traditional gender roles in the households and for women to carry out a larger share of the housework.
This has been confirmed by a number of other studies; for example Makiko Fuwa from University of California, Irvine noted that while there has been movement towards greater equality, "in 1995 American women still spent nearly twice as much time on housework than men" and there is also a segregation of household tasks.
This gendered division of household labor creates what is known as the second shift or double burden, where working women in a heterosexual couple with a working partner spend significantly more time than their partners on childcare and household chores.
Researchers from the University of Maryland have found that while men have steadily begun to perform more household labor since 1965, most of the essential and traditionally feminine tasks are still carried out by women; men generally carry out more nonessential or infrequent tasks, such as taking out the trash or mowing the lawn.
While both genders tend to have roughly equal amounts of leisure time, men have more uninterrupted leisure time when compared to women. Working mothers also tend to get less sleep when compared to their working husbands.
Education:
Literacy and enrollment in primary and secondary education are at parity in the United States, and women are over-represented in tertiary education. There is, however, notable gender segregation in degree choice, correlated with lower incomes for graduates with "feminine" degrees, such as education or nursing, and higher incomes for those with "masculine" degrees, such as engineering.
In addition, men have a statistically significant advantage over women when applying for highly selective universities, despite the fact that women generally outperform men in high school.
Females started outnumbering males in higher education in 1992.
Other issues:
Research conducted at Lycoming College has found the enjoyment of sexist humor to be strongly correlated with sexual aggression towards women among male college students. In addition, studies have shown that exposure to sexist humor, particularly humor related to sexual assault, can increase male aggression and their tendency to discriminate against women.
One study also asserted that the attitudes behind such humor create an environment in which such discriminatory and possibly violent behavior is acceptable. Men's tendency to self-report the likelihood that they would commit sexually violent acts has also been found to increase after exposure to sexist humor, as reported by researchers from the University of Kent.
Benevolent sexism, sometimes referred to as chivalry, which holds women up as something to be protected, also has psychological effects. Women who hold these views tend to have less ambitious career goals, and men who hold them tend to have a polarized and stereotyped view of women, made up of both very favorable and very unfavorable traits.
In such cases, the stereotyped view of women is "favorable in content and yet prejudicial in [its] consequences," and attempts to provide justification for discriminatory behaviors presented as helpful or paternal.
Click on any of the following blue hyperlinks for more about Gender Inequality in the United States:
- Current issues for men
- Current issues for transgender people
- Government policy
- Rankings
- See also:
- 21st-century globalization impacts on gender inequality in the United States
- Affirmative action
- Civil Rights Act
- Double burden
- Education Amendments of 1972, Title IX
- Employment discrimination law in the United States
- Equal Pay Act of 1963
- Gender role
- Lilly Ledbetter
- Work-family balance in the United States
- New report documents persistent gender inequalities in U.S. media
Gender Pay Gap in the United States including the Gender Gap Report by the World Economic Forum
Picture: Chart showing gains in income that women have made in comparison to income by men, 1979-2005
The gender pay gap in the United States is the ratio of female-to-male median yearly earnings among full-time, year-round workers.
The average woman's unadjusted annual salary has been cited as 78% to 82% of that of the average man's.
However, after adjusting for choices made by male and female workers in college major, occupation, working hours, and parental leave, multiple studies find that pay rates between males and females varied by 5–6.6%, with women earning about 94 cents for every dollar earned by their male counterparts.
The remaining gap of around 6% has been speculated to originate from gender discrimination and from gender differences in the ability and willingness to negotiate salaries.
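The conversion between these two ways of stating the gap (a percentage gap versus cents per dollar) is simple arithmetic; the sketch below uses the figures quoted above, and the function name is invented for illustration.

```python
# Illustrative conversion between a pay gap (percent less than men)
# and cents earned per dollar; figures are the ones quoted in the text.

def female_to_male_ratio(gap_percent):
    """Cents a woman earns per dollar a man earns, given the gap in percent."""
    return round(100 - gap_percent, 1)

# Unadjusted gap: women cited as earning 78% to 82% of men's earnings,
# i.e. a gap of 18 to 22 percentage points.
print([female_to_male_ratio(g) for g in (22, 18)])    # [78, 82]

# Adjusted gap of 5-6.6% implies roughly 93 to 95 cents per dollar.
print([female_to_male_ratio(g) for g in (6.6, 5.0)])  # [93.4, 95.0]
```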
The extent to which discrimination plays a role in explaining gender wage disparities is somewhat difficult to quantify, due to a number of potentially confounding variables.
A 2010 research review by the majority staff of the United States Congress Joint Economic Committee reported that studies have consistently found unexplained pay differences even after controlling for measurable factors that are assumed to influence earnings – suggestive of unknown/unmeasurable contributing factors of which gender discrimination may be one. Other studies have found direct evidence of discrimination – for example, more jobs went to women when the applicant's sex was unknown during the hiring process.
Click on any of the following blue hyperlinks for more about the Gender Pay Gap in the United States:
- Statistics
- Explaining the gender pay gap
- Sources of disparity
- Impact
- Current policy solutions
- Popular culture reactions
- See also:
Gender Gap Report by the World Economic Forum (see below)
The Global Gender Gap Report was first published in 2006 by the World Economic Forum. The 2016 report covers 144 major and emerging economies. The Global Gender Gap Index is an index designed to measure gender equality.
The report’s Gender Gap Index ranks countries according to the calculated gender gap between women and men in four key areas (economic participation and opportunity, educational attainment, political empowerment, and health and survival) to gauge the state of gender equality in a country.
The report measures women's disadvantage compared to men, and is not strictly a measure of equality. Gender imbalances to the advantage of women do not affect the score. So, for example, the indicator "number of years of a female head of state (last 50 years) over male value" would score 1 if the number of years was 25, but would still score 1 if the number of years was 50.
Due to this methodology, gender gaps that favor women over men are reported as parity; for example, women's longer life expectancy in a country does not raise the score above the benchmark, so it cannot make deficits of equality in other areas less visible in the score.
The three highest ranking countries have closed over 84% of their gender gaps, while the lowest ranking country has closed only a little over 50% of its gender gap. It "assesses countries on how well they are dividing their resources and opportunities among their male and female populations, regardless of the overall levels of these resources and opportunities," the Report says.
"By providing a comprehensible framework for assessing and comparing global gender gaps and by revealing those countries that are role models in dividing these resources equitably between women and men, the Report serves as a catalyst for greater awareness as well as greater exchange between policymakers."
The report examines four overall areas of inequality between men and women in 130 economies around the globe, over 93% of the world’s population:
- Economic participation and opportunity – outcomes on salaries, participation levels and access to high-skilled employment
- Educational attainment – outcomes on access to basic and higher level education
- Political empowerment – outcomes on representation in decision-making structures
- Health and survival – outcomes on life expectancy and sex ratio. In this case parity is not assumed: there are assumed to be fewer female births than male (944 females for every 1,000 males), and men are assumed to die younger. Provided that women live at least six percent longer than men, parity is assumed; if the difference is less than six percent, it counts as a gender gap.
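The truncation-at-parity rule described above can be sketched in a few lines; this is a simplified illustration (function name invented), not the WEF's actual methodology, and the benchmark values are the ones quoted in the text.

```python
# Simplified sketch of the report's truncation-at-parity scoring rule:
# a female-to-male ratio is capped at the equality benchmark, so gaps
# favoring women do not raise the score above parity.

def truncated_score(female_value, male_value, benchmark=1.0):
    """Score a female/male ratio, capped at the equality benchmark."""
    if male_value == 0:
        return benchmark  # pure female advantage is still capped at parity
    return min(female_value / male_value, benchmark)

# The head-of-state example from the text: 25 of the last 50 years with a
# female head of state already scores 1, and 50 of 50 scores no higher.
print(truncated_score(25, 25))  # 1.0
print(truncated_score(50, 0))   # 1.0 (still capped)

# Health benchmarks quoted above: parity for the sex ratio at birth is
# taken as 0.944 females per male, so a higher ratio is capped there.
print(truncated_score(0.95, 1.0, benchmark=0.944))  # 0.944
```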
Thirteen out of the fourteen variables used to create the index are from publicly available "hard data" indicators from international organizations, such as the International Labour Organization, the United Nations Development Programme and the World Health Organization.
Click on any of the following blue hyperlinks for more about the Global Gender Gap Report:
- WEF Global Gender Gap Index rankings
- Controversies
- See also:
- Gender Empowerment Measure
- Gender Inequality Index
- Gender-Related Development Index
- Social Institutions and Gender Index
- Female labour force in the Muslim world
- Women's rights in 2014
- Gender Gap in Stem Careers Infographic
- "Women Leaders and Gender Parity". World Economic Forum. Geneva, Switzerland.
- Daily chart: Sex and equality, The Economist, Oct 25th 2013
- Reports:
- Ricardo Hausmann, Laura D. Tyson, Yasmina Bekhouche, Saadia Zahidi (2014). The Global Gender Gap Index 2014 (PDF). World Economic Forum, Geneva, Switzerland. Retrieved 2014-11-26.
- Ricardo Hausmann, Laura D. Tyson, Saadia Zahidi, Editors (2013). The Global Gender Gap Report 2013 (PDF). World Economic Forum, Geneva, Switzerland. Retrieved 2013-10-26.
- Ricardo Hausmann, Laura D. Tyson, Saadia Zahidi, Editors (2012). The Global Gender Gap Report 2012 (PDF). World Economic Forum, Geneva, Switzerland. Retrieved 2012-10-26.
- Ricardo Hausmann, Laura D. Tyson, Saadia Zahidi, Editors (2010). The Global Gender Gap Report 2010 (PDF). World Economic Forum, Geneva, Switzerland. Retrieved 2010-10-20.
- Ricardo Hausmann, Laura D. Tyson, Saadia Zahidi, Editors (2009). The Global Gender Gap Report 2009 (PDF). World Economic Forum, Geneva, Switzerland.
- Ricardo Hausmann, Laura D. Tyson, Saadia Zahidi, Editors (2008). The Global Gender Gap Report 2008 (PDF). World Economic Forum, Geneva, Switzerland. Retrieved 2008-11-19.
- Ricardo Hausmann, Laura D. Tyson, Saadia Zahidi, Editors (2007). The Global Gender Gap Report 2007 (PDF). World Economic Forum, Geneva, Switzerland. Retrieved 2008-11-19.
- Ricardo Hausmann, Laura D. Tyson, Saadia Zahidi, Editors (2006). The Global Gender Gap Report 2006 (PDF). World Economic Forum, Geneva, Switzerland. Retrieved 2008-11-19.
About the World Economic Forum
The World Economic Forum (WEF) is a Swiss nonprofit foundation, based in Cologny, Geneva, Switzerland.
Recognized in 2015 by the Swiss authorities as an "other international body" under Switzerland's Host State Act 2007 (HSA, SR 192.12), its mission is cited as "committed to improving the state of the world by engaging business, political, academic, and other leaders of society to shape global, regional, and industry agendas".
The forum is best known for its annual meeting at the end of January in Davos, a mountain resort in Graubünden, in the eastern Alps region of Switzerland. The meeting brings together some 2,500 top business leaders, international political leaders, economists, celebrities and journalists for up to four days to discuss the most pressing issues facing the world.
Often this location alone is used to identify meetings, participation, and participants, with such phrases as "a Davos panel" and "Davos man" being used.
The organization also convenes some six to eight regional meetings each year in locations across Africa, East Asia, and Latin America, and holds two further annual meetings in China, India and the United Arab Emirates. Beside meetings, the foundation produces a series of research reports and engages its members in sector-specific initiatives.
Click here for more about the World Economic Forum.
Tax policy and its economic inequality in the United States
- YouTube Video: How tax breaks help the rich
- YouTube Video: Who Benefits From Corporate Tax Cuts? (Velshi & Ruhle | MSNBC)
- YouTube Video: How Amazon Paid $0 Federal Income Tax in 2018
Tax policy and economic inequality in the United States discusses how tax policy affects the distribution of income and wealth in the United States. Income inequality can be measured before- and after-tax; this article focuses on the after-tax aspects.
Income tax rates applied to various income levels, together with tax expenditures (i.e., deductions, exemptions, and preferential rates that modify the outcome of the rate structure), primarily determine how market results are redistributed and thus shape after-tax inequality. After-tax inequality has risen markedly in the United States since 1980, after a more egalitarian period following World War II.
Overview:
Tax policy is the mechanism through which market results are redistributed, affecting after-tax inequality. The provisions of the United States Internal Revenue Code regarding income taxes and estate taxes have undergone significant changes under both Republican and Democratic administrations and Congresses since 1964.
Since the Johnson Administration, the top marginal income tax rate has been reduced from 91% for the wealthiest Americans in 1963, to a low of 35% under George W. Bush, rising recently to 39.6% (or in some cases 43.4%) in 2013 under the Obama Administration.
Capital gains taxes have also decreased over the last several years, and have experienced a more punctuated evolution than income taxes as significant and frequent changes to these rates occurred from 1981 to 2011. Both estate and inheritance taxes have been steadily declining since the 1990s.
Economic inequality in the United States (see previous topic) has been steadily increasing since the 1980s as well and economists such as Paul Krugman, Joseph Stiglitz, and Peter Orszag, politicians like Barack Obama and Paul Ryan, and media entities have engaged in debates and accusations over the role of tax policy changes in perpetuating economic inequality.
Tax expenditures (i.e., deductions, exemptions, and preferential tax rates) represent a major driver of inequality, as the top 20% get roughly 50% of the benefit from them, with the top 1% getting 17% of the benefit.
For example, a 2011 Congressional Research Service report stated, "Changes in capital gains and dividends were the largest contributor to the increase in the overall income inequality." The CBO estimated tax expenditures would be $1.5 trillion in fiscal year 2017, approximately 8% of GDP; for scale, the budget deficit has historically averaged around 3% of GDP.
Scholarly and popular literature exists on this topic with numerous works on both sides of the debate. The work of Emmanuel Saez, for example, has concerned the role of American tax policy in aggregating wealth into the richest households in recent years while Thomas Sowell and Gary Becker maintain that education, globalization, and market forces are the root causes of income and overall economic inequality.
The Revenue Act of 1964 and the "Bush Tax Cuts" coincide with the rising economic inequality in the United States both by socioeconomic class and race
Changes in economic inequality:
Income inequality:
Main article: Income inequality in the United States
Economists and related experts have described America's growing income inequality as "deeply worrying", unjust, a danger to democracy/social stability, and a sign of national decline. Yale professor Robert Shiller, who was among three Americans who won the Nobel prize for economics in 2013, said after receiving the award, "The most important problem that we are facing now today, I think, is rising inequality in the United States and elsewhere in the world."
Inequality in land and income ownership is negatively correlated with subsequent economic growth. A strong demand for redistribution may occur in societies where a large section of the population does not have access to the productive resources of the economy. Voters may internalize such issues.
High unemployment rates have a significant negative effect when interacting with increases in inequality. Increasing inequality harms growth in countries with high levels of urbanization. High and persistent unemployment also has a negative effect on subsequent long-run economic growth.
Unemployment may seriously harm growth because it is a waste of resources, generates redistributive pressures and distortions, depreciates existing human capital and deters its accumulation, drives people to poverty, results in liquidity constraints that limit labor mobility, and because it erodes individual self-esteem and promotes social dislocation, unrest and conflict. Policies to control unemployment and reduce its inequality-associated effects can strengthen long-run growth.
Gini coefficient:
Main article: Gini coefficient
The Gini Coefficient, a statistical measure of the inequality present in a nation's income distribution developed by Italian statistician and sociologist Corrado Gini, has increased for the United States over the last few decades. The closer the Gini Coefficient is to one (or 100%), the closer the income distribution is to absolute inequality.
In 2007, the United Nations approximated the United States' Gini Coefficient at 41% while the CIA Factbook placed the coefficient at 45%. The United States' Gini Coefficient was below 40% in 1964 and slightly declined through the 1970s. However, around 1981, the Gini Coefficient began to increase and rose steadily through the 2000s.
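What the coefficient measures can be made concrete with a standard formula for a finite sample: half the mean absolute difference between all pairs of incomes, divided by the mean income. The incomes below are invented for illustration.

```python
# Gini coefficient of a finite income sample: half the mean absolute
# difference between all income pairs, divided by the mean income.
# 0 = perfect equality, values near 1 = extreme concentration.

def gini(incomes):
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * n * mean)

print(gini([50, 50, 50, 50]))  # 0.0  -- everyone earns the same
print(gini([0, 0, 0, 100]))    # 0.75 -- one person has all the income
```

For a sample of n people where one person has everything, the formula gives (n - 1) / n, approaching 1 as the sample grows.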
Wealth distribution:
Main article: Wealth inequality in the United States
Wealth, in economic terms, is defined as the value of an individual's or household's total assets minus their total liabilities.
The components of wealth include assets, both monetary and non-monetary, and income. Wealth is accrued over time by savings and investment. Levels of savings and investment are determined by an individual's or a household's consumption, the market real interest rate, and income.
Individuals and households with higher incomes are more capable of saving and investing because they can devote more of their disposable income to those ends while still optimizing their consumption functions. It is more difficult for lower-income individuals and households to save and invest because they must use a higher percentage of their income for fixed and variable costs, leaving them a more limited amount of disposable income to optimize their consumption.
Accordingly, a natural wealth gap exists in any market as some workers earn higher wages and thus are able to divert more income towards savings and investment which build wealth.
The wealth gap in the United States is large and the large majority of net worth and financial wealth is concentrated in a relatively very small percentage of the population. Sociologist and University of California-Santa Cruz professor G. William Domhoff writes that "numerous studies show that the wealth distribution has been extremely concentrated throughout American history" and that "most Americans (high income or low income, female or male, young or old, Republican or Democrat) have no idea just how concentrated the wealth distribution actually is."
In 2007, the top 1% of households owned 34.6% of all privately held wealth and the next 19% possessed 50.5% of all privately held wealth. Taken together, 20% of Americans controlled 85.1% of all privately held wealth in the country.
In the same year, the top 1% of households also possessed 42.7% of all financial wealth and the top 19% owned 50.3% of all financial wealth in the country. Together, the top 20% of households owned 93% of the financial wealth in the United States. Financial wealth is defined as "net worth minus net equity in owner-occupied housing."
In real money terms, and not just percentage share of wealth, the wealth gap between the top 1% and the rest of the population is immense. The average wealth of households in the top 1% of the population was $13.977 million in 2009. This is five times as large as the average household wealth for the next four percent (average household wealth of $2.7 million), fifteen times as large as the average household wealth for the next five percent (average household wealth of $908,000), and twenty-nine times the size of the average household wealth of the next ten percent of the population (average household wealth of $477,000) in the same year.
Comparatively, the average household wealth of the lowest quintile was -$27,000 and the average household wealth of the second quintile (bottom 20-40th percentile of the population) was $5,000. The middle class, the middle quintile of the population, had an average household wealth level of $65,000.
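The multiples quoted above can be checked directly from the average-wealth figures given in the text (2009 dollars):

```python
# Average household wealth by group in 2009, from the figures in the text.
top_1_pct = 13_977_000
groups = [
    ("next 4%", 2_700_000),
    ("next 5%", 908_000),
    ("next 10%", 477_000),
]

# Ratio of top-1% average wealth to each group's average wealth.
for label, value in groups:
    print(f"top 1% / {label}: {top_1_pct / value:.0f}x")
# top 1% / next 4%: 5x
# top 1% / next 5%: 15x
# top 1% / next 10%: 29x
```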
According to the Congressional Budget Office, the real, or inflation-adjusted, after-tax earnings of the wealthiest one percent of Americans grew by 275% from 1979 to 2007.
Simultaneously, the real, after-tax earnings of the bottom twenty percent of wage earners in the United States grew 18%. The difference in the growth of real income between the top 1% and the bottom 20% of Americans was thus 257 percentage points. The average increase in real, after-tax income for all U.S. households during this time period was 62%, slightly below the real, after-tax income growth rate of 65% experienced by the top 20% of wage earners, not accounting for the top 1%.
Data aggregated and analyzed by Robert B. Reich, Thomas Piketty, and Emmanuel Saez and released in a New York Times article written by Bill Marsh shows that real wages for production and non-supervisory workers, which account for 82% of the U.S. workforce, increased by 100% from 1947 to 1979 but then increased by only 8% from 1979–2009. Their data also shows that the bottom fifth experienced a 122% growth rate in wages from 1947 to 1979 but then experienced a negative growth rate of 4% in their real wages from 1979–2009.
The real wages of the top fifth rose by 99% and then 55% during the same periods, respectively. Average real hourly wages have also increased by a significantly larger rate for the top 20% than they have for the bottom 20%. Real family income for the bottom 20% increased by 7.4% from 1979 to 2009 while it increased by 49% for the top 20% and increased by 22.7% for the second top fifth of American families.
As of 2007, the United Nations estimated the ratio of average income of the top 10% to the bottom 10% of Americans at 15.9:1. The ratio of average income of the top 20% to the bottom 20% in the same year was 8.4:1.
According to these UN statistics, the United States has the third highest disparity between the average income of the top 10% and 20% to the bottom 10% and bottom 20% of the population, respectively, of the OECD (Organization for Economic Co-operation and Development) countries.
Only Chile and Mexico have larger average income disparities between the top 10% and bottom 10% of the population with 26:1 and 23:1, respectively. Consequently, the United States has the fourth highest Gini Coefficient of the OECD countries at 40.8% which is lower than Chile's (52%), Mexico's (51%), and just lower than Turkey's (42%).
Tax structure:
A 2011 Congressional Research Service report stated, "Changes in capital gains and dividends were the largest contributor to the increase in the overall income inequality.
Taxes were less progressive in 2006 than in 1996, and consequently, tax policy also contributed to the increase in income inequality between 1996 and 2006. But overall income inequality would likely have increased even in the absence of tax policy changes."
Since 1964, the U.S. income tax, including the capital gains tax, has become less progressive (although recent changes have made the federal tax code the most progressive since 1979). The estate tax, a highly progressive tax, has also been reduced over the last decades.
A progressive tax code is believed to mitigate the effects of recessions by taking a smaller percentage of income from lower-income consumers than from other consumers in the economy so they can spend more of their disposable income on consumption and thus restore equilibrium. This is known as an automatic stabilizer as it does not need Congressional action such as legislation. It also mitigates inflation by taking more money from the wealthiest consumers so their large level of consumption does not create demand-driven inflation.
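How a progressive schedule takes a smaller share of low incomes can be sketched with a toy marginal-bracket calculator; the bracket thresholds and rates below are invented for illustration and do not correspond to any actual U.S. schedule.

```python
# Toy progressive marginal-rate schedule (invented numbers).
# Each rate applies only to the slice of income inside its bracket.
BRACKETS = [(0, 0.10), (30_000, 0.20), (100_000, 0.35)]  # (lower bound, rate)

def tax_owed(income):
    """Total tax under the toy schedule, summing each bracket's slice."""
    tax = 0.0
    for i, (lower, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > lower:
            tax += (min(income, upper) - lower) * rate
    return tax

for income in (20_000, 60_000, 300_000):
    print(f"income {income}: effective rate {tax_owed(income) / income:.1%}")
# income 20000: effective rate 10.0%
# income 60000: effective rate 15.0%
# income 300000: effective rate 29.0%
```

Because the effective rate rises with income, a household whose income falls in a recession sees its tax bill fall more than proportionally, which is the automatic-stabilizer effect described above.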
One argument against the view that tax policy increases income inequality is analysis of the overall share of wealth controlled by the top 1%.
Income Tax:
Main article: Income tax in the United States
The Revenue Act of 1964 was the first bill of the Post-World War II era to reduce marginal income tax rates.
This reform, proposed under John F. Kennedy but passed under Lyndon Johnson, reduced the top marginal income tax rate from 91% (for tax year 1963, on annual incomes of $2.9 million+ adjusted for inflation) to 77% (for tax year 1964) and 70% (for tax year 1965, on annual incomes of $1.4 million+).
It was the first tax legislation to reduce the top end of the marginal income tax rate distribution since 1924. The top marginal income tax rate had been 91% since 1946 and had not been below 70% since 1936.
The "Bush Tax Cuts," which are the popularly known names of the Economic Growth and Tax Relief Reconciliation Act of 2001 and the Jobs and Growth Tax Relief Reconciliation Act of 2003 passed during President George W. Bush's first term, reduced the top marginal income tax rate from 38.6% (annual income at $382,967+ adjusted for inflation) to 35%.
These rates were continued under the Obama Administration and extended through 2013. The number of income tax brackets generally declined during this period as well, though several years, particularly after 1992, saw an increase in the number of brackets. In 1964, there were 26 income tax brackets.
The number of brackets was reduced to 16 by 1981 and then collapsed into 13 brackets after passage of the Economic Recovery Tax Act of 1981. Five years later, the 13 income tax brackets were collapsed into five under the Reagan Administration. By the end of the G. H. W. Bush administration in 1992, the number of income tax brackets had reached an all-time low of three but President Bill Clinton oversaw a reconfiguration of the brackets that increased the number to five in 1993.
The current number of income tax brackets, as of 2011, is six, which is the number of brackets configured under President George W. Bush.
The New York Times reported in July 2018 that: "The top-earning 1 percent of households — those earning more than $607,000 a year — will pay a combined $111 billion less this year in federal taxes than they would have if the laws had remained unchanged since 2000. That's an enormous windfall. It's more, in total dollars, than the tax cut received over the same period by the entire bottom 60 percent of earners." This represents the tax cuts for the top 1% from the Bush tax cuts and Trump tax cuts, partially offset by the tax increases on the top 1% by Obama.
Effective tax rates:
Ronald Reagan made very large reductions in the nominal marginal income tax rates with his Tax Reform Act of 1986, which did not make a similarly large reduction in the effective tax rate on marginal incomes.
Noah writes in his ten part series entitled "The Great Divergence," that "in 1979, the effective tax rate on the top 0.01 percent was 42.9 percent, according to the Congressional Budget Office, but by Reagan's last year in office it was 32.2%."
This effective rate held steady until the first few years of the Clinton presidency, when it increased to a peak of 41%. However, it fell back to the low 30s by his second term in the White House. This reduction in the effective marginal income tax rate for the wealthiest Americans, 9%, is not a very large decrease in their tax burden, according to Noah, especially in comparison to the 20% drop in nominal rates from 1980 to 1981 and the 15% drop in nominal rates from 1986 to 1987.
In addition to this small reduction on the income taxes of the wealthiest taxpayers in America, Noah discovered that the effective income tax burden for the bottom 20% of wage earners was 8% in 1979 and dropped to 6.4% under the Clinton Administration.
This effective rate further dropped under the George W. Bush Administration. Under Bush, the rate decreased from 6.4% to 4.3%. Reductions in the effective income tax burden on the poor coinciding with modest reductions in the effective income tax rate on the wealthiest 0.01% of tax payers could not have been the driving cause of increased income inequality that began in the 1980s.
These figures are similar to an analysis of effective federal tax rates from 1979 to 2005 by the Congressional Budget Office. Those figures show a decrease in the total effective tax rate from 37.0% in 1979 to 29% in 1989, and a drop in the effective individual income tax rate from 21.8% to 19.9% over the same period. By 2010, however, the top 1 percent of all households paid an average federal tax rate of 29.4 percent, with 2013 rates set to be significantly higher.
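The gap between nominal and effective rates discussed above comes from bracket structure: only the slice of income above each threshold is taxed at that bracket's rate. A minimal sketch, using a hypothetical three-bracket schedule rather than any actual year's rates:

```python
def tax_owed(income, brackets):
    """Progressive tax: each (threshold, rate) pair taxes only the slice
    of income above that threshold, up to the next threshold."""
    owed = 0.0
    for i, (lo, rate) in enumerate(brackets):
        hi = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        if income > lo:
            owed += (min(income, hi) - lo) * rate
    return owed

# Hypothetical schedule: 10% to $50k, 25% to $200k, 35% above.
brackets = [(0, 0.10), (50_000, 0.25), (200_000, 0.35)]
income = 400_000
tax = tax_owed(income, brackets)
print(tax, tax / income)  # 112500.0, effective rate ~28%, below the 35% nominal top rate
```

Because most income is taxed in the lower brackets, the effective rate always sits below the nominal top rate, which is why the two series in the text move differently.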
Capital gains tax:
Main article: Capital gains tax in the United States
Capital gains are profits from investments in capital assets such as bonds, stocks, and real estate. For individuals, gains on assets held for less than one year are taxed as ordinary income, at the recipient's marginal income tax rate. This is known as the short-term capital gains rate.
Accordingly, the capital gains tax rate for short-term capital gains paid by an individual is equal to the marginal income tax rate of that individual. The tax rate then decreases once the capital gain becomes a long-term capital gain, or is held for 1 year or more.
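The holding-period rule just described can be sketched as a small helper; the 15% long-term rate used here is only one of the historical rates discussed below, chosen for illustration, and the function name is my own:

```python
def capital_gains_rate(holding_days, marginal_income_rate, long_term_rate=0.15):
    """Return the applicable capital gains rate for an individual.

    Short-term gains (asset held under one year) are taxed at the
    seller's ordinary marginal income tax rate; long-term gains get
    the lower preferential rate (illustratively 15% here).
    """
    if holding_days < 365:
        return marginal_income_rate
    return long_term_rate

# An asset held 200 days by someone in the 35% bracket:
print(capital_gains_rate(200, 0.35))  # 0.35
# The same asset held two years:
print(capital_gains_rate(730, 0.35))  # 0.15
```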
In 1964, the effective capital gains tax rate was 25%. This means that the actual tax percentage of all capital gains realized in the U.S. in 1964 was 25% as opposed to the nominal capital gains tax rate, or the percentage that would have been collected by the government prior to deductions and evasions.
This effective rate held constant until a small rise in 1968 to 26.9%, then began steadily increasing until it peaked at 39.875% in 1978. This top rate fell to 28% in 1979 and dropped further to 20% in 1982.
This top capital gains rate held until 1986, when the Tax Reform Act of 1986 raised it back to 28% (33% for individuals subject to phase-outs). The Tax Reform Act of 1986 treated capital gains as income for the first time, thus equalizing the short-term capital gains and marginal income tax rates. The top rate of 28% (not counting taxpayers subject to a phase-out) remained, despite increases in marginal income tax rates, until 1997, when it was lowered to 20%.
Starting in May 1997, long-term capital gains were divided into multiple subgroups based on how long investors had held them, each with its own tax rate. This effectively reduced the top tax rate on a long-term capital gain (an asset held for over one year) from 28% to 20%.
These subgroups were reorganized into less-than-one-year, one-to-five-year, and five-years-or-more holding periods and were in place from 1998 to 2003. In 2003, the divisions reverted to the less-than-one-year and more-than-one-year categories, until 2011, when they reverted to the three divisions first implemented in 1998. The 20% top rate remained until 2003, when it was further reduced to 15%.
The 15% long-term capital gains tax rate was then changed back to its 1997 level of 20% in 2011. Capital gains tax rates for the bottom two and top two income tax brackets have also changed significantly since the late 1980s. The short-term and long-term capital gains rates for the bottom two brackets (15% and 28%, respectively) were equal to those taxpayers' marginal income tax rates from 1988 until 1997.
In 1997, the capital gains tax rates for the bottom two income tax brackets were reduced to 10% and 20% for the 15% and 28% brackets, respectively. These rates remained until 2001, when President Bush made additional changes for the bottom two brackets, including lowering the tax on long-term capital gains held for more than five years from 10% to 8%.
Bush also reduced the tax on short-term capital gains from 28% to 15% for the 15% bracket and lowered the tax on long-term capital assets from 20% to 10%. In 2003, the capital gains tax on long-term capital assets decreased from 10% to 5% for both of the bottom two brackets (10% and 15%). In 2008, these rates dropped to 0%, but were restored to the 2003 rates in 2011 under President Obama via the extension of the Bush Tax Cuts.
Overall, capital gains tax rates have decreased significantly for both the bottom two and the top two income tax brackets. Since 1988, the top two brackets have seen a net decrease of 13 percentage points in their long-term capital gains tax rates, while the bottom two brackets' long-term rates have changed by 10 and 13 percentage points, respectively.
The difference between the income and long-term capital gains tax rates for the top two income tax brackets (5% in 1988; 18% and 20%, respectively, in 2011) is, however, larger than the corresponding difference for the bottom two brackets (0% in 1988; 5% and 10%, respectively, in 2011). As of the 2013 tax year, all investment income for high-earning households is subject to a 3.8% surtax, bringing the top capital gains rate to 23.8%.
Gift tax:
Main article: Gift tax in the United States
The inheritance tax, also known as the "gift tax", has been altered in the post-World War II era as well.
First established in 1932 as a means of raising revenue from the wealthiest Americans, the tax was set at a nominal rate 25 percentage points lower than the estate tax, which meant its effective rate was 18.7%. Its exemption, up to $50,000, was the same as the estate tax exemption.
Under current law, individuals can give gifts of up to $13,000 a year without incurring a tax, and couples can pool their gifts to give up to $26,000 a year tax-free. The lifetime gift tax exemption is $5 million, the same amount as the estate tax exemption.
These two exemptions are directly tied to each other as the amount exempted from one reduces the amount that can be exempted from the other at a 1:1 ratio. The inheritance/gift tax generally affects a very small percentage of the population as most citizens do not inherit anything from their deceased relatives in any given year.
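The 1:1 linkage between the two exemptions described above can be sketched as follows; the $5 million figure is the lifetime exemption cited in the text, and the function itself is illustrative:

```python
LIFETIME_EXEMPTION = 5_000_000  # unified gift/estate exemption under the law described

def remaining_estate_exemption(taxable_lifetime_gifts):
    """Taxable lifetime gifts draw down the unified exemption
    dollar-for-dollar, reducing what remains to shelter the estate."""
    return max(0.0, LIFETIME_EXEMPTION - taxable_lifetime_gifts)

# $1 million in taxable lifetime gifts leaves $4 million of estate exemption:
print(remaining_estate_exemption(1_000_000))
```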
In 2000, the Federal Reserve Bank of Cleveland published a report that found that 1.6% of Americans received an inheritance of $100,000 or more and an additional 1.1% received an inheritance worth $50,000 to $100,000 while 91.9% of Americans did not receive an inheritance.
A 2010 report conducted by Citizens for Tax Justice found that only 0.6% of the population would pass on an inheritance in the event of death in that fiscal year. Accordingly, data shows that inheritance taxes are a tax almost exclusively on the wealthy. In 1986, Congress enacted legislation to prevent trust funds of wealthy individuals from skipping a generation before taxes had to be paid on the inheritance.
Estate tax:
Main article: Estate tax in the United States
Estate taxes, while affecting more taxpayers than inheritance taxes, do not affect many Americans and are also considered a tax aimed at the wealthy. In 2007, all of the state governments combined collected $22 billion in receipts from estate taxes, and these taxes affected less than 5% of the population overall, including less than 1% of citizens in every state.
In 2004, the average federal estate tax burden was zero for the bottom 80% of the population by household, while the average burden for the top 20% was $1,362. The table below gives a general impression of the spread of estate taxes by income.
A certain dollar amount of every estate can be exempted from tax, however. For example, if the government allows an exemption of up to $2 million on an estate then the tax on a $4 million estate would only be paid on $2 million worth of that estate, not all $4 million.
This reduces the effective estate tax rate. In 2001, the "exclusion" amount on estates was $675,000 and the top tax rate was 55%. The exclusion amount steadily increased to $3.5 million by 2009 while the tax rate dropped to 45% when it was temporarily repealed in 2010.
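The exclusion arithmetic in the example above can be sketched directly; the helper name is my own, and the second call simply applies the 2001 parameters quoted in the text:

```python
def taxable_estate(estate_value, exclusion):
    """Only the portion of the estate above the exclusion is taxed."""
    return max(0.0, estate_value - exclusion)

# The example in the text: a $4 million estate with a $2 million exemption
# is taxed only on the $2 million above the exclusion.
print(taxable_estate(4_000_000, 2_000_000))

# 2001 parameters: $675,000 exclusion, 55% top rate; tax on a $1M estate's excess:
print(taxable_estate(1_000_000, 675_000) * 0.55)
```

Because only the excess is taxed, the effective rate on the whole estate is always below the statutory rate.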
The estate tax was reinstated in 2011 with a further increased cap of $5 million for individuals and $10 million for couples filing jointly and a reduced rate of 35%. The "step-up basis" of estate tax law allows a recipient of an estate or portion of an estate to have a tax basis in the property equal to the market value of the property. This enables recipients of an estate to sell it at market value without having paid any tax on it.
According to the Congressional Budget Office, this exemption costs the federal government $715 billion a year.
Sales tax:
Main article: Sales tax in the United States
Sales taxes are taxes placed on the sale or lease of goods and services in the United States. While no national general sales tax exists, the federal government levies several national selective sales taxes. States also may levy selective sales taxes on the sale or lease of particular goods or services. States may also delegate to local governments the authority to impose additional general or selective sales taxes.
Tax expenditures:
The term "tax expenditures" refers to income exclusions, deductions, preferential rates, and credits that reduce revenues for any given level of tax rates in the individual, payroll, and corporate income tax systems. Like conventional spending, they contribute to the federal budget deficit. They also influence choices about working, saving, and investing, and affect the distribution of income.
The amount of reduced federal revenue is significant: CBO estimated it at nearly 8% of GDP, or about $1.5 trillion, in 2017. For scale, that is roughly half the revenue collected by the government and nearly three times the budget deficit. Since eliminating a tax expenditure changes economic behavior, the additional revenue that would actually be generated is somewhat less than the estimated size of the expenditure.
CBO reported that the following were among the largest individual (non-corporate) tax expenditures in 2013:
In 2013, CBO estimated that more than half of the combined benefits of 10 major tax expenditures would apply to households in the top 20% income group, and that 17% of the benefit would go to the top 1% of households. The top 20% of income earners pay about 70% of federal income taxes, excluding payroll taxes.
For scale, 50% of the $1.5 trillion in tax expenditures in 2016 was $750 billion, while the U.S. budget deficit was approximately $600 billion. In other words, eliminating the tax expenditures for the top 20% might balance the budget over the short-term, depending on economic feedback effects.
Credits and exemptions:
Education:
Further information: Educational attainment in the United States
Economist Gary Becker has described educational attainment as the root of economic mobility. The United States offers several tax incentives for education, such as the American Opportunity Tax Credit and Hope credit along with tax exemptions for scholarships and grants.
Those who do not qualify for such aid can obtain a low-interest student loan, which may be subsidized based on financial need, and tuition can often be deducted from the federal income tax. Such loans were created with the goal of encouraging greater social mobility and equality of opportunity.
According to Becker, the rise in returns on investments in human capital is beneficial and desirable to society because it increases productivity and standards of living. However, the cost for college tuition has increased significantly faster than inflation, leading the United States to have one of the most expensive higher education systems in the world.
It has been suggested that tax policy could be used to help reduce these costs, by taxing the endowment income of universities and linking the endowment tax to tuition rates. The United States spends about 7.3% of GDP ($1.1 trillion in 2011 - public and private, all levels) annually on education, with 70% funded publicly through varying levels of federal, state, and local taxation.
Healthcare:
Further information: Health insurance coverage in the United States
The United States tax code includes deductions and penalties with regard to health insurance coverage. The number of uninsured in the United States, many of whom are working poor or unemployed, is one of the primary concerns raised by advocates of health care reform.
The costs of treating the uninsured must often be absorbed by providers as charity care, passed on to the insured via cost shifting and higher health insurance premiums, or paid by taxpayers through higher taxes. The federal income tax offers employers a deduction for amounts contributed to health care plans.
Beginning in 2014, the Patient Protection and Affordable Care Act encouraged states to expand Medicaid for low-income households, funded by additional federal taxes. Some of these taxes specifically target wealthier households: income from self-employment and wages of single individuals in excess of $200,000 annually is subject to an additional tax of 0.9%.
The threshold amount is $250,000 for a married couple filing jointly (threshold applies to joint compensation of the two spouses), or $125,000 for a married person filing separately.
In addition, a Medicare tax of 3.8% will apply to unearned income, specifically the lesser of net investment income or the amount by which adjusted gross income exceeds $200,000 ($250,000 for a married couple filing jointly; $125,000 for a married person filing separately.)
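The two surtaxes described above can be sketched for a single filer; thresholds and rates are as given in the text, while the function name and the sample figures are illustrative:

```python
def aca_surtaxes(wages, net_investment_income, magi, threshold=200_000):
    """ACA additional taxes for a single filer (sketch):
    0.9% extra Medicare tax on wages above the threshold, plus a 3.8%
    net investment income tax on the lesser of net investment income
    or modified AGI above the threshold."""
    extra_medicare = 0.009 * max(0.0, wages - threshold)
    niit = 0.038 * min(net_investment_income, max(0.0, magi - threshold))
    return extra_medicare, niit

# Hypothetical single filer: $250k wages, $60k investment income, $310k MAGI
print(aca_surtaxes(250_000, 60_000, 310_000))  # roughly (450.0, 2280.0)
```

Note that the 3.8% tax applies to the lesser of the two amounts, so a filer with little investment income owes little even with a high AGI.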
In March 2018, the CBO reported that the ACA had reduced income inequality in 2014, saying that the law led the lowest and second quintiles (the bottom 40%) to receive an average of an additional $690 and $560 respectively while causing households in the top 1% to pay an additional $21,000 due mostly to the net investment income tax and the additional Medicare tax. The law placed relatively little burden on households in the top quintile (top 20%) outside of the top 1%.
Compression and divergence in tax code changes:
Princeton economics professor, Nobel laureate, and John Bates Clark Medal winner Paul Krugman argues that politics, not economic conditions, has made income inequality in the United States "unique" and of a degree that "other advanced countries have not seen."
According to Krugman, government action can either compress or widen income inequality through tax policy and other redistributive or transfer policies. Krugman illustrates this point by describing "The Great Compression" and "The Great Divergence."
Krugman states that the period from the end of the Great Depression to the end of World War II, 1939–1946, saw a rapid narrowing of the spread of the income distribution in America, which effectively created the middle class. Krugman calls this economic period "The Great Compression" because the income distribution was compressed.
Krugman attributes this phenomenon to intrinsically equalizing economic policy such as increased tax rates on the wealthy, higher corporate tax rates, a pro-union organizing environment, minimum wage, Social Security, unemployment insurance, and "extensive government controls on the economy that were used in a way that tended to equalize incomes."
This "artificial[ly]" created middle class endured due to the creation of middle class institutions, norms, and expectations that promoted income equality. Krugman believes this period ends in 1980, which he points out as being "interesting" because it was when "Reagan came to the White House."
From 1980 to the present, Krugman believes income inequality has been uniquely shaped by the political environment and not by the global economic environment. For example, the U.S. and Canada both had approximately 30% of their workers in unions during the 1960s. By 2010, however, around 25% of Canadian workers were still unionized, while only 11% of American workers were.
Krugman blames Reagan for this rapid decline in unionization because he "declared open season on unions" while the global market clearly made room for unions as Canada's high union rate proves.
Contrary to the arguments made by Chicago economists such as Gary Becker, Krugman points out that while the wealth gap between the college-educated and the non-college-educated continues to grow, the largest rise in income inequality is among college graduates themselves, between ordinary graduates and the very highest earners, rather than between college graduates and non-graduates.
The average high school teacher, according to Krugman, has a post-graduate degree which is a comparable level of education to a hedge fund manager whose income is several times that of the average high school teacher. In 2006, the "highest paid hedge fund manager in the United States made an amount equal to the salaries of all 80,000 New York City school teachers for the next three years."
Accordingly, Krugman believes that education and a shifting global market are not the sole causes of increased income inequality since the 1980s, but rather that politics and the implementation of conservative ideology have concentrated wealth among the rich. Some of these political policies include the Reagan tax cuts of 1981 and 1986.
Nobel laureate Joseph Stiglitz asserts in a Vanity Fair article published in May 2011 entitled "Of the 1%, by the 1%, for the 1%" that "preferential tax treatment for special interests" has helped increase income inequality in the United States as well as reduced the efficiency of the market. He specifically points to the reduction in capital gains over the last few years, which are "how the rich receive a large portion of their income," as giving the wealthy a "free ride."
Stiglitz criticizes the "marginal productivity theory," saying that the largest gains in wages are going toward what he considers less-than-worthy occupations such as finance, whose effects have been "massively negative." If income inequality were predominantly explained by the rising marginal productivity of the educated, he asks, why are financiers, who are responsible for bringing the U.S. economy "to the brink of ruin," so richly rewarded?
Thomas Piketty and Emmanuel Saez wrote in their work "Income Inequality in the United States, 1913–1998" that "top income and wages shares (in the United States) display a U-shaped pattern over the century" and "that the large shocks that capital owners experienced during the Great Depression and World War II have had a permanent effect on top capital incomes...that steep progressive income and estate taxation may have prevented large fortunes from recovery from the shocks."
Saez and Piketty argue that the "working rich" are now at the top of the income ladder in the United States and that their wealth far outpaces the rest of the country's. Piketty and Saez plotted the share of total income accrued by the top 1%, the next 4% (the top 5% excluding the top 1%), and the next 5% (the top 10% excluding the top 5%) of wage earners in the United States from 1913 to 2008. According to their data, in 1963 the top 1% received about 10% of total income, while the next 4% received approximately 13% and the next 5% around 12%.
By 1984, the share of total income going to the top 1% had risen from 10% to 16%, while the next 4% and next 5% held 13.5% and 12%, respectively. The top 1% share then rose to 22% by 1998, while the shares of the other two groups remained roughly constant (15% and 12% of total income, respectively).
The top 1% share fell to 16% during the post-9/11 recession but returned to its 1998 level by 2008. In 2008, the gap in income share between the top 1% and the top 5% was 7%, and the gap between the top 1% and top 10% was 9%. This is an 11-point reversal from the respective shares held by these groups in 1963. Income inequality clearly accelerated beginning in the 1980s.
Larry Bartels, a Princeton political scientist and the author of Unequal Democracy, argues that federal tax policy since 1964, and even before, has increased economic inequality in the United States. He states that real income growth for low- and middle-class workers is significantly smaller under Republican administrations than under Democratic ones, while real income growth for the upper class is much larger under Republican administrations.
Bartels finds that from 1948 to 2005, pre-tax real income growth for the bottom 20% grew by 1.42% while pre-tax real income growth for the top 20% grew by 2%. Under the Democratic administrations in this time period, (Truman, Kennedy, Johnson, Carter, and Clinton) the pre-tax real income growth rate for the bottom 20% was 2.64% while the pre-tax real income growth rate for the top 20% was 2.12%.
During the Republican administrations of this time period (Eisenhower, Nixon, Ford, Reagan, G. H. W. Bush, and G. W. Bush), the pre-tax real income growth rate was 0.43% for the bottom 20% and 1.90% for the top 20%.
The disparity under Democratic presidents in this time period between the top and bottom 20% pre-tax real income growth rate was -0.52% while the disparity under Republican presidents was 1.47%.
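The growth-rate disparities Bartels reports can be checked with a quick sketch; the figures are taken directly from the text above, and the function name is my own:

```python
# Pre-tax real income growth rates (percent per year), 1948-2005,
# for the bottom and top quintiles, as reported by Bartels.
growth = {
    "Democratic": {"bottom_20": 2.64, "top_20": 2.12},
    "Republican": {"bottom_20": 0.43, "top_20": 1.90},
}

def disparity(party):
    """Top-20% growth minus bottom-20% growth; a negative value means
    the bottom quintile grew faster than the top quintile."""
    g = growth[party]
    return round(g["top_20"] - g["bottom_20"], 2)

print(disparity("Democratic"))  # -0.52
print(disparity("Republican"))  # 1.47
```

The two results reproduce the -0.52% and 1.47% disparities quoted in the text.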
The pre-tax real income growth rate for the wealthiest 40%, 60%, and 80% of population was higher under the Democratic administrations than it was under the Republican administrations in this time period. The United States was more equal and growing wealthier, based on income, under Democratic Presidents from 1948-2005 than it was under Republican Presidents in the same time period.
Additionally, Bartels believes that the reduction and the temporary repeal of the estate tax also increased income inequality by benefiting almost exclusively the wealthiest in America.
According to a working paper released by the Society for the Study of Economic Inequality entitled "Tax policy and income inequality in the U.S., 1978–2009: A decomposition approach," tax policy can either exacerbate or curtail economic inequality.
This paper argues that tax policy reforms passed under Republican administrations since 1979 have increased economic inequality, while reforms under Democratic administrations during the same period have reduced it. The net effect of tax reforms on economic inequality since 1979 is essentially zero, as the opposing policies have neutralized each other.
Policy responses:
Public policy responses addressing causes and effects of income inequality include:
Taxes on the wealthy:
The Congressional Budget Office reported that less progressive tax and transfer policies contributed to an increase in after-tax income inequality between 1979 and 2007. This indicates that more progressive income tax policies (e.g., higher income taxes on the wealthy and a higher earned-income tax credit) would reduce after-tax income inequality.
In their World Inequality Report published in December 2017, Piketty, Saez and coauthors revealed that in "Russia and the United States, the rise in wealth inequality has been extreme, whereas in Europe it has been more moderate."
They reported that the tax system in the United States, along with "massive educational inequalities", have grown "less progressive despite a surge in top labor compensation since the 1980s, and in top capital incomes in the 2000s."
The "top 1% income share was close to 10% in the [US and Europe] in 1980, it rose only slightly to 12% in 2016 in Western Europe [where taxation and education policies are more progressive] while it shot up to 20% in the United States."
The "bottom 50% income share decreased from more than 20% in 1980 to 13% in 2016." In 2012, the economists Emmanuel Saez and Thomas Piketty had recommended much higher top marginal tax rates on the wealthy, up to 50 percent, 70 percent or even 90 percent.
Ralph Nader, Jeffrey Sachs, the United Front Against Austerity, among others, call for a financial transactions tax (also known as the Robin Hood tax) to bolster the social safety net and the public sector.
The Pew Center reported in January 2014 that 54% of Americans supported raising taxes on the wealthy and corporations to expand aid to the poor. By party, 29% of Republicans and 75% of Democrats supported this action.
Senator Elizabeth Warren proposed an annual tax on wealth in January 2019, specifically a 2% tax for wealth over $50 million and another 1% surcharge on wealth over $1 billion. Wealth is defined as including all asset classes, including financial assets and real estate. Economists Emmanuel Saez and Gabriel Zucman estimated that about 75,000 households (less than 0.1%) would pay the tax.
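The proposed schedule just described can be sketched as a marginal-rate calculation; rates and thresholds are as given in the text, and the function name is illustrative:

```python
def warren_wealth_tax(net_worth):
    """Proposed annual wealth tax as described in the text: 2% on net
    worth above $50 million, plus a further 1% surcharge (3% total)
    on net worth above $1 billion."""
    tax = 0.02 * max(0.0, net_worth - 50_000_000)
    tax += 0.01 * max(0.0, net_worth - 1_000_000_000)
    return tax

print(warren_wealth_tax(10_000_000))     # 0.0: below the $50M threshold
print(warren_wealth_tax(100_000_000))    # 2% of the $50M excess = $1,000,000
print(warren_wealth_tax(2_000_000_000))  # 2% of $1.95B plus 1% of $1B = $49,000,000
```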
The tax would raise around $2.75 trillion over 10 years, roughly 1% of GDP on average per year. It would raise the total tax burden for those subject to the wealth tax from 3.2% of their wealth under current law to about 4.3% on average, versus 7.2% for the bottom 99% of families. For scale, the federal budget deficit in 2018 was 3.9% of GDP and is expected to rise toward 5% of GDP over the next decade. The plan received both praise and criticism.
Two billionaires, Michael Bloomberg and Howard Schultz, criticized the proposal as "unconstitutional" and "ridiculous," respectively. Warren was not surprised by this reaction, stating: "Another billionaire who thinks that billionaires shouldn't pay more in taxes."
Economist Paul Krugman wrote in January 2019 that polls indicate the idea of taxing the rich more is very popular.
Senators Charles Schumer and Bernie Sanders advocated limiting stock buybacks in January 2019. They explained that from 2008-2017, 466 of the S&P 500 companies spent $4 trillion on stock buybacks, about 50% of profits, with another 40% going to dividends.
During 2018 alone, a record $1 trillion was spent on buybacks. Stock buybacks shift wealth upwards, because the top 1% own about 40% of shares and the top 10% own about 85%.
Further, corporations directing profits to shareholders are not reinvesting the money in the firm or paying workers more. They wrote: "If corporations continue to purchase their own stock at this rate, income disparities will continue to grow, productivity will suffer, the long-term strength of companies will diminish — and the American worker will fall further behind."
Their proposed legislation would prohibit buybacks unless the corporation has taken other steps first, such as paying workers more, providing more benefits such as healthcare and pensions, and investing in the community. To prevent corporations from shifting from buybacks to dividends, they proposed limiting dividends, perhaps by taking action through the tax code.
Income tax rates applied to various income levels and tax expenditures (i.e., deductions, exemptions, and preferential rates that modify the outcome of the rate structure) primarily drive how market results are redistributed to impact after-tax inequality. After-tax inequality has risen markedly in the United States since 1980, after a more egalitarian period following World War II.
Overview:
Tax policy is the mechanism through which market results are redistributed, affecting after-tax inequality. The provisions of the United States Internal Revenue Code regarding income taxes and estate taxes have undergone significant changes under both Republican and Democratic administrations and Congresses since 1964.
Since the Johnson Administration, the top marginal income tax rates have been reduced from 91% for the wealthiest Americans in 1963, to a low of 35% under George W. Bush, rising recently to 39.6% (or in some cases 43.4%) in 2013 under the Obama Administration.
Capital gains taxes have also decreased over the last several years, and have experienced a more punctuated evolution than income taxes as significant and frequent changes to these rates occurred from 1981 to 2011. Both estate and inheritance taxes have been steadily declining since the 1990s.
Economic inequality in the United States (see previous topic) has been steadily increasing since the 1980s as well and economists such as Paul Krugman, Joseph Stiglitz, and Peter Orszag, politicians like Barack Obama and Paul Ryan, and media entities have engaged in debates and accusations over the role of tax policy changes in perpetuating economic inequality.
Tax expenditures (i.e., deductions, exemptions, and preferential tax rates) represent a major driver of inequality, as the top 20% get roughly 50% of the benefit from them, with the top 1% getting 17% of the benefit.
For example, a 2011 Congressional Research Service report stated, "Changes in capital gains and dividends were the largest contributor to the increase in the overall income inequality." CBO estimated tax expenditures would be $1.5 trillion in fiscal year 2017, approximately 8% GDP; for scale, the budget deficit historically has averaged around 3% GDP.
Scholarly and popular literature exists on this topic with numerous works on both sides of the debate. The work of Emmanuel Saez, for example, has concerned the role of American tax policy in aggregating wealth into the richest households in recent years while Thomas Sowell and Gary Becker maintain that education, globalization, and market forces are the root causes of income and overall economic inequality.
The Revenue Act of 1964 and the "Bush Tax Cuts" coincide with rising economic inequality in the United States, both by socioeconomic class and by race.
Changes in economic inequality:
Income inequality:
Main article: Income inequality in the United States
Economists and related experts have described America's growing income inequality as "deeply worrying", unjust, a danger to democracy/social stability, and a sign of national decline. Yale professor Robert Shiller, who was among three Americans who won the Nobel prize for economics in 2013, said after receiving the award, "The most important problem that we are facing now today, I think, is rising inequality in the United States and elsewhere in the world."
Inequality in land and income ownership is negatively correlated with subsequent economic growth. A strong demand for redistribution may occur in societies where a large section of the population does not have access to the productive resources of the economy. Voters may internalize such issues.
High unemployment rates have a significant negative effect when interacting with increases in inequality. Increasing inequality harms growth in countries with high levels of urbanization. High and persistent unemployment also has a negative effect on subsequent long-run economic growth.
Unemployment may seriously harm growth because it is a waste of resources, generates redistributive pressures and distortions, depreciates existing human capital and deters its accumulation, drives people to poverty, results in liquidity constraints that limit labor mobility, and because it erodes individual self-esteem and promotes social dislocation, unrest and conflict. Policies to control unemployment and reduce its inequality-associated effects can strengthen long-run growth.
Gini coefficient:
Main article: Gini coefficient
The Gini coefficient, a statistical measure of the inequality present in a nation's income distribution developed by the Italian statistician and sociologist Corrado Gini, has increased for the United States over the last few decades. The closer the Gini coefficient is to one, the closer the income distribution is to absolute inequality.
In 2007, the United Nations approximated the United States' Gini Coefficient at 41% while the CIA Factbook placed the coefficient at 45%. The United States' Gini Coefficient was below 40% in 1964 and slightly declined through the 1970s. However, around 1981, the Gini Coefficient began to increase and rose steadily through the 2000s.
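For readers who want to see how the statistic behaves, the Gini coefficient can be computed from a list of incomes via the mean absolute difference formula. This is an illustrative sketch, not part of any cited analysis:

```python
def gini(incomes):
    """Gini coefficient via the mean absolute difference:
    G = sum_i sum_j |x_i - x_j| / (2 * n^2 * mean)."""
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(a - b) for a in incomes for b in incomes)
    return diff_sum / (2 * n * n * mean)

# Perfect equality yields 0; concentrating all income in one
# person out of n gives (n - 1) / n, approaching 1 as n grows.
```

For example, four people with equal incomes score 0, while four people where one holds everything score 0.75.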
Wealth distribution:
Main article: Wealth inequality in the United States
Wealth, in economic terms, is defined as the value of an individual's or household's total assets minus total liabilities.
The components of wealth include assets, both monetary and non-monetary, and income. Wealth is accrued over time by savings and investment. Levels of savings and investment are determined by an individual's or a household's consumption, the market real interest rate, and income.
Individuals and households with higher incomes are more capable of saving and investing because they can set aside more of their disposable income while still optimizing their consumption. It is more difficult for lower-income individuals and households to save and invest because they must devote a higher percentage of their income to fixed and variable costs, leaving them with less disposable income to allocate.
Accordingly, a natural wealth gap exists in any market as some workers earn higher wages and thus are able to divert more income towards savings and investment which build wealth.
The wealth gap in the United States is large and the large majority of net worth and financial wealth is concentrated in a relatively very small percentage of the population. Sociologist and University of California-Santa Cruz professor G. William Domhoff writes that "numerous studies show that the wealth distribution has been extremely concentrated throughout American history" and that "most Americans (high income or low income, female or male, young or old, Republican or Democrat) have no idea just how concentrated the wealth distribution actually is."
In 2007, the top 1% of households owned 34.6% of all privately held wealth and the next 19% possessed 50.5% of all privately held wealth. Taken together, 20% of Americans controlled 85.1% of all privately held wealth in the country.
In the same year, the top 1% of households also possessed 42.7% of all financial wealth and the top 19% owned 50.3% of all financial wealth in the country. Together, the top 20% of households owned 93% of the financial wealth in the United States. Financial wealth is defined as "net worth minus net equity in owner-occupied housing."
In real money terms, and not just percentage share of wealth, the wealth gap between the top 1% and the rest of the population is immense. The average wealth of households in the top 1% of the population was $13.977 million in 2009. This is five times as large as the average household wealth for the next four percent ($2.7 million), fifteen times that of the next five percent ($908,000), and twenty-nine times that of the next ten percent of the population ($477,000) in the same year.
Comparatively, the average household wealth of the lowest quintile was -$27,000 and the average household wealth of the second quintile (the 20th–40th percentile of the population) was $5,000. The middle quintile of the population, the middle class, had an average household wealth of $65,000.
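As a quick arithmetic check, the ratios quoted above follow directly from the 2009 average-wealth figures:

```python
# Average household wealth by group, 2009 (figures from the text)
top_1_pct   = 13_977_000
next_4_pct  =  2_700_000
next_5_pct  =    908_000
next_10_pct =    477_000

ratios = {
    "vs next 4%":  top_1_pct / next_4_pct,   # ~5x
    "vs next 5%":  top_1_pct / next_5_pct,   # ~15x
    "vs next 10%": top_1_pct / next_10_pct,  # ~29x
}
```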
According to the Congressional Budget Office, the real, or inflation-adjusted, after-tax earnings of the wealthiest one percent of Americans grew by 275% from 1979 to 2007.
Simultaneously, the real, after-tax earnings of the bottom twenty percent of wage earners in the United States grew 18%. The difference in the growth of real income between the top 1% and the bottom 20% of Americans was thus 257 percentage points. The average increase in real, after-tax income for all U.S. households during this period was 62%, slightly below the 65% growth experienced by the top 20% of wage earners, excluding the top 1%.
Data aggregated and analyzed by Robert B. Reich, Thomas Piketty, and Emmanuel Saez and released in a New York Times article written by Bill Marsh shows that real wages for production and non-supervisory workers, which account for 82% of the U.S. workforce, increased by 100% from 1947 to 1979 but then increased by only 8% from 1979–2009. Their data also shows that the bottom fifth experienced a 122% growth rate in wages from 1947 to 1979 but then experienced a negative growth rate of 4% in their real wages from 1979–2009.
The real wages of the top fifth rose by 99% and then 55% during the same periods, respectively. Average real hourly wages have also increased by a significantly larger rate for the top 20% than they have for the bottom 20%. Real family income for the bottom 20% increased by 7.4% from 1979 to 2009 while it increased by 49% for the top 20% and increased by 22.7% for the second top fifth of American families.
As of 2007, the United Nations estimated the ratio of average income for the top 10% to the bottom 10% of Americans, via the Gini Coefficient, as 15.9:1. The ratio of average income for the top 20% to the bottom 20% in the same year and using the same index was 8.4:1.
According to these UN statistics, the United States has the third highest disparity between the average income of the top 10% and 20% to the bottom 10% and bottom 20% of the population, respectively, of the OECD (Organization for Economic Co-operation and Development) countries.
Only Chile and Mexico have larger average income disparities between the top 10% and bottom 10% of the population, at 26:1 and 23:1, respectively. Consequently, the United States has the fourth highest Gini coefficient of the OECD countries at 40.8%, which is lower than Chile's (52%) and Mexico's (51%) and just below Turkey's (42%).
Tax structure:
A 2011 Congressional Research Service report stated, "Changes in capital gains and dividends were the largest contributor to the increase in the overall income inequality.
Taxes were less progressive in 2006 than in 1996, and consequently, tax policy also contributed to the increase in income inequality between 1996 and 2006. But overall income inequality would likely have increased even in the absence of tax policy changes."
Since 1964, the U.S. income tax, including the capital gains tax, has become less progressive (although recent changes have made the federal tax code the most progressive since 1979). The estate tax, a highly progressive tax, has also been reduced over the last decades.
A progressive tax code is believed to mitigate the effects of recessions by taking a smaller percentage of income from lower-income consumers than from other consumers in the economy so they can spend more of their disposable income on consumption and thus restore equilibrium. This is known as an automatic stabilizer as it does not need Congressional action such as legislation. It also mitigates inflation by taking more money from the wealthiest consumers so their large level of consumption does not create demand-driven inflation.
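The mechanics of progressivity can be sketched with a bracket schedule. The brackets and rates below are invented purely for illustration; they are not the actual U.S. schedule:

```python
# Hypothetical brackets for illustration: (lower bound, marginal rate)
BRACKETS = [(0, 0.10), (50_000, 0.25), (200_000, 0.35)]

def tax_owed(income):
    """Apply each marginal rate only to the slice of income
    falling inside its bracket."""
    owed = 0.0
    bounds = [b for b, _ in BRACKETS[1:]] + [float("inf")]
    for (low, rate), high in zip(BRACKETS, bounds):
        if income > low:
            owed += (min(income, high) - low) * rate
    return owed

# The average rate rises with income, so after-tax income falls
# proportionally less for lower earners -- the stabilizing effect.
```

Under this toy schedule a $50,000 earner pays an average rate of 10% while a $100,000 earner pays 17.5%, which is the sense in which a progressive code automatically cushions lower-income consumers.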
One argument against the view that tax policy increases income inequality is analysis of the overall share of wealth controlled by the top 1%.
Income Tax:
Main article: Income tax in the United States
The Revenue Act of 1964 was the first bill of the Post-World War II era to reduce marginal income tax rates.
This reform, proposed under John F. Kennedy but passed under Lyndon Johnson, reduced the top marginal income tax rate (on annual incomes of $2.9 million+, adjusted for inflation) from 91% (for tax year 1963) to 77% (for tax year 1964), and to 70% (for tax year 1965) on annual incomes of $1.4 million+.
It was the first tax legislation to reduce the top end of the marginal income tax rate distribution since 1924. The top marginal income tax rate had been 91% since 1946 and had not been below 70% since 1936.
The "Bush Tax Cuts," which are the popularly known names of the Economic Growth and Tax Relief Reconciliation Act of 2001 and the Jobs and Growth Tax Relief Reconciliation Act of 2003 passed during President George W. Bush's first term, reduced the top marginal income tax rate from 38.6% (annual income at $382,967+ adjusted for inflation) to 35%.
These rates were continued under the Obama Administration and extended through 2013. The number of income tax brackets declined during this time period as well, but several years, particularly after 1992, saw an increase in the number of income tax brackets. In 1964, there were 26 income tax brackets.
The number of brackets was reduced to 16 by 1981 and then collapsed into 13 brackets after passage of the Economic Recovery Tax Act of 1981. Five years later, the 13 income tax brackets were collapsed into five under the Reagan Administration. By the end of the G. H. W. Bush administration in 1992, the number of income tax brackets had reached an all-time low of three but President Bill Clinton oversaw a reconfiguration of the brackets that increased the number to five in 1993.
The current number of income tax brackets, as of 2011, is six which is the number of brackets configured under President George W. Bush.
The New York Times reported in July 2018 that: "The top-earning 1 percent of households — those earning more than $607,000 a year — will pay a combined $111 billion less this year in federal taxes than they would have if the laws had remained unchanged since 2000. That's an enormous windfall. It's more, in total dollars, than the tax cut received over the same period by the entire bottom 60 percent of earners." This represents the tax cuts for the top 1% from the Bush tax cuts and Trump tax cuts, partially offset by the tax increases on the top 1% by Obama.
Effective tax rates:
Ronald Reagan made very large reductions in nominal marginal income tax rates with his Tax Reform Act of 1986, but this did not produce a similarly large reduction in the effective tax rate on marginal incomes.
Noah writes in his ten-part series entitled "The Great Divergence" that "in 1979, the effective tax rate on the top 0.01 percent was 42.9 percent, according to the Congressional Budget Office, but by Reagan's last year in office it was 32.2%."
This effective rate held steady until the first few years of the Clinton presidency, when it rose to a peak of 41%. However, it fell back to the low 30s by his second term in the White House. This percentage reduction in the effective marginal income tax rate for the wealthiest Americans, 9%, is not a very large decrease in their tax burden, according to Noah, especially in comparison to the 20% drop in nominal rates from 1980 to 1981 and the 15% drop in nominal rates from 1986 to 1987.
In addition to this small reduction on the income taxes of the wealthiest taxpayers in America, Noah discovered that the effective income tax burden for the bottom 20% of wage earners was 8% in 1979 and dropped to 6.4% under the Clinton Administration.
This effective rate further dropped under the George W. Bush Administration. Under Bush, the rate decreased from 6.4% to 4.3%. Reductions in the effective income tax burden on the poor coinciding with modest reductions in the effective income tax rate on the wealthiest 0.01% of tax payers could not have been the driving cause of increased income inequality that began in the 1980s.
These figures are similar to an analysis of effective federal tax rates from 1979–2005 by the Congressional Budget Office. The figures show a decrease in the total effective tax rate from 37.0% in 1979 to 29% in 1989. The effective individual income tax rate dropped from 21.8% to 19.9% over the same period. However, by 2010, the top 1 percent of all households paid an average federal tax rate of 29.4 percent, with 2013 rates expected to be significantly higher.
Capital gains tax:
Main article: Capital gains tax in the United States
Capital gains are profits from investments in capital assets such as bonds, stocks, and real estate. For individuals, gains on assets held for less than one year are taxed as ordinary income, meaning they face the same marginal rate as the recipient's marginal income tax rate. This is known as the capital gains tax rate on short-term capital gains.
Accordingly, the capital gains tax rate for short-term capital gains paid by an individual is equal to the marginal income tax rate of that individual. The tax rate then decreases once the capital gain becomes a long-term capital gain, or is held for 1 year or more.
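The holding-period rule described above can be sketched as follows (a simplified illustration; `ordinary_rate` and `long_term_rate` stand in for the taxpayer's actual rates, which vary by bracket and year):

```python
def capital_gains_rate(holding_days, ordinary_rate, long_term_rate):
    """Short-term gains (assets held under one year) are taxed at the
    taxpayer's ordinary marginal income rate; assets held a year or
    more receive the preferential long-term rate."""
    return ordinary_rate if holding_days < 365 else long_term_rate
```

For example, a taxpayer in a 35% bracket pays 35% on a gain realized after six months, but only the long-term rate (say 15%) on the same gain realized after fourteen months.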
In 1964, the effective capital gains tax rate was 25%. This means that the actual tax percentage of all capital gains realized in the U.S. in 1964 was 25% as opposed to the nominal capital gains tax rate, or the percentage that would have been collected by the government prior to deductions and evasions.
This effective rate held constant until a small rise in 1968 up to 26.9% and then began steadily increasing until it peaked at 39.875% in 1978. This top rate then fell to 28% in 1979 and further dropped to 20% in 1982.
This top capital gains rate held until 1986, when the Tax Reform Act of 1986 raised it back to 28% (and 33% for individuals subject to phase-outs). The Tax Reform Act of 1986 shifted capital gains to income for the first time, thus equalizing the short-term capital gains and marginal income tax rates. The top rate of 28%, not accounting for taxpayers under the stipulations of a phase-out, remained until 1997, despite increases in marginal income tax rates, when it was lowered to 20%.
Starting in May 1997, however, long-term capital gains were divided into multiple subgroups based on the duration of time investors held them. Each new subgroup had a different tax rate. This effectively reduced the top capital gains tax rate on a long-term capital good held for over 1 year from 28% to 20%.
These multiple subgroups were reorganized into less than one year, one to five years, and five years or more, and were in place from 1998 to 2003. In 2003, the divisions reverted to the less-than-one-year and more-than-one-year categories until 2011, when they reverted to the three divisions first implemented in 1998. The top long-term rate of 20% remained until 2003, when it was further reduced to 15%.
The 15% long-term capital gains tax rate was then changed back to its 1997 rate of 20% in 2011. Capital gains taxes for the bottom two and top two income tax brackets have changed significantly since the late 1980s. The short-term and long-term capital gains tax rates for the bottom two tax rates, 15% and 28%, respectively, were equal to those tax payers' marginal income tax rates from 1988 until 1997.
In 1997, the capital gains tax rates for the bottom two income tax brackets (the 15% and 28% brackets) were reduced to 10% and 20%, respectively. These rates remained until 2001, when President Bush made additional changes for the bottom two brackets, lowering them to 10% and 15%, respectively, and reducing the tax on long-term capital gains held for more than five years from 10% to 8%.
Bush also reduced the tax on short-term capital gains from 28% to 15% for the 15% tax bracket as well as lowered the tax on long-term capital goods from 20% to 10%. In 2003, the capital gains tax on long-term capital goods decreased from 10% to 5% for both of the bottom two tax brackets (10% and 15%). In 2008, these same rates were dropped to 0% but were restored to the 2003 rates in 2011 under President Obama via the extension of the Bush Tax Cuts.
Overall, capital gains tax rates decreased significantly for both the bottom two and the top two income tax brackets. The top two income tax brackets have had a net decrease in their long-term capital gains tax rates of 13% since 1988, while the lowest two income tax brackets' long-term capital gains tax rates have changed by 10% and 13%, respectively, in that time.
The difference between income and long-term capital gains taxes for the top two income tax brackets (5% in 1988 and 18% and 20%, respectively, in 2011), however, is larger than the difference between the income and long-term capital gains tax rates for the bottom two income tax brackets (0% in 1988 and 5% and 10%, respectively, in 2011). As of the 2013 tax year, all investment income for high earning households will be subject to a 3.8% surtax bringing the top capital gains rate to 23.8%.
Gift tax:
Main article: Gift tax in the United States
The inheritance tax, which is also known as the "gift tax", has been altered in the Post-World War II era as well.
First established in 1932 as a means to raise tax revenue from the wealthiest Americans, the inheritance tax was set at a nominal rate 25 percentage points lower than the estate tax, which meant its effective rate was 18.7%. Its exemption, up to $50,000, was the same as the estate tax exemption.
Under current law, individuals can give gifts of up to $13,000 without incurring a tax, and couples can pool their gifts to give up to $26,000 a year without incurring a tax. The lifetime gift tax exemption is $5 million, the same amount as the estate tax exemption.
These two exemptions are directly tied to each other as the amount exempted from one reduces the amount that can be exempted from the other at a 1:1 ratio. The inheritance/gift tax generally affects a very small percentage of the population as most citizens do not inherit anything from their deceased relatives in any given year.
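The interaction between the annual exclusion and the unified lifetime exemption can be sketched as follows (a simplified model using the figures quoted above; `exemption_after_gift` is a hypothetical helper, and real gift-tax accounting is per recipient and per year):

```python
ANNUAL_EXCLUSION = 13_000        # per recipient, per year (figure from the text)
LIFETIME_EXEMPTION = 5_000_000   # unified with the estate tax exemption

def exemption_after_gift(gift, remaining=LIFETIME_EXEMPTION):
    """Only the portion of a gift above the annual exclusion
    draws down the unified lifetime exemption, at a 1:1 ratio."""
    excess = max(0, gift - ANNUAL_EXCLUSION)
    return remaining - excess
```

So a $13,000 gift leaves the lifetime exemption untouched, while a $113,000 gift reduces it by $100,000, and whatever is used here is no longer available against the estate tax.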
In 2000, the Federal Reserve Bank of Cleveland published a report that found that 1.6% of Americans received an inheritance of $100,000 or more and an additional 1.1% received an inheritance worth $50,000 to $100,000 while 91.9% of Americans did not receive an inheritance.
A 2010 report conducted by Citizens for Tax Justice found that only 0.6% of the population would pass on an inheritance in the event of death in that fiscal year. Accordingly, data shows that inheritance taxes are a tax almost exclusively on the wealthy. In 1986, Congress enacted legislation to prevent trust funds of wealthy individuals from skipping a generation before taxes had to be paid on the inheritance.
Estate tax:
Main article: Estate tax in the United States
Estate taxes, while affecting more taxpayers than inheritance taxes, do not affect many Americans and are also considered to be a tax aimed at the wealthy. In 2007, all of the state governments combined collected $22 billion in tax receipts from estate taxes and these taxes affected less than 5% of the population including less than 1% of citizens in every state.
In 2004, the average tax burden of the federal estate tax was 0% for the bottom 80% of the population by household. The average tax burden of the estate tax for the top 20% was $1,362. The table below gives a general impression of the spread of estate taxes by income.
A certain dollar amount of every estate can be exempted from tax, however. For example, if the government allows an exemption of up to $2 million on an estate then the tax on a $4 million estate would only be paid on $2 million worth of that estate, not all $4 million.
This reduces the effective estate tax rate. In 2001, the "exclusion" amount on estates was $675,000 and the top tax rate was 55%. The exclusion amount steadily increased to $3.5 million by 2009 while the tax rate dropped to 45% when it was temporarily repealed in 2010.
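The exemption example above works out as follows (a sketch; the 45% nominal rate is assumed for illustration, taken from the 2009 figure quoted above):

```python
def estate_tax(estate_value, exemption, rate):
    """Tax applies only to the portion of the estate above the exemption."""
    return max(0.0, estate_value - exemption) * rate

# The text's example: a $4M estate with a $2M exemption is taxed
# on only $2M, halving the effective rate relative to the nominal one.
tax = estate_tax(4_000_000, 2_000_000, 0.45)   # assumed 45% nominal rate
effective_rate = tax / 4_000_000               # 22.5%
```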
The estate tax was reinstated in 2011 with a further increased cap of $5 million for individuals and $10 million for couples filing jointly and a reduced rate of 35%. The "step-up basis" of estate tax law allows a recipient of an estate or portion of an estate to have a tax basis in the property equal to the market value of the property. This enables recipients of an estate to sell it at market value without having paid any tax on it.
According to the Congressional Budget Office, this exemption costs the federal government $715 billion a year.
Sales tax:
Main article: Sales tax in the United States
Sales taxes are taxes placed on the sale or lease of goods and services in the United States. While no national general sales tax exists, the federal government levies several national selective sales taxes. States also may levy selective sales taxes on the sale or lease of particular goods or services. States may also delegate to local governments the authority to impose additional general or selective sales taxes.
Tax expenditures:
The term "tax expenditures" refers to income exclusions, deductions, preferential rates, and credits that reduce revenues for any given level of tax rates in the individual, payroll, and corporate income tax systems. Like conventional spending, they contribute to the federal budget deficit. They also influence choices about working, saving, and investing, and affect the distribution of income.
The amount of reduced federal revenue is significant, estimated by CBO at nearly 8% of GDP, or about $1.5 trillion in 2017; for scale, that is roughly half the revenue collected by the government and nearly three times the size of the budget deficit. Since eliminating a tax expenditure changes economic behavior, the amount of additional revenue that would be generated is somewhat less than the estimated size of the tax expenditure.
CBO reported that the following were among the largest individual (non-corporate) tax expenditures in 2013:
- The exclusion from workers' taxable income of employers' contributions for health care, health insurance premiums, and premiums for long-term care insurance ($248B);
- The exclusion of contributions to and the earnings of pension funds such as 401k plans ($137B);
- Preferential tax rates on dividends and long-term capital gains ($161B); and
- The deductions for state and local taxes ($77B), mortgage interest ($70B) and charitable contributions ($39B).
In 2013, CBO estimated that more than half of the combined benefits of 10 major tax expenditures would apply to households in the top 20% income group, and that 17% of the benefit would go to the top 1% households. The top 20% of income earners pay about 70% of federal income taxes, excluding payroll taxes.
For scale, 50% of the $1.5 trillion in tax expenditures in 2016 was $750 billion, while the U.S. budget deficit was approximately $600 billion. In other words, eliminating the tax expenditures for the top 20% might balance the budget over the short-term, depending on economic feedback effects.
Credits and exemptions:
Education:
Further information: Educational attainment in the United States
Economist Gary Becker has described educational attainment as the root of economic mobility. The United States offers several tax incentives for education, such as the American Opportunity Tax Credit and Hope credit along with tax exemptions for scholarships and grants.
Those who do not qualify for such aid can obtain a low-interest student loan, which may be subsidized based on financial need, and tuition can often be deducted from the federal income tax. Such loans were created with the goal of encouraging greater social mobility and equality of opportunity.
According to Becker, the rise in returns on investments in human capital is beneficial and desirable to society because it increases productivity and standards of living. However, the cost for college tuition has increased significantly faster than inflation, leading the United States to have one of the most expensive higher education systems in the world.
It has been suggested that tax policy could be used to help reduce these costs, by taxing the endowment income of universities and linking the endowment tax to tuition rates. The United States spends about 7.3% of GDP ($1.1 trillion in 2011 - public and private, all levels) annually on education, with 70% funded publicly through varying levels of federal, state, and local taxation.
Healthcare:
Further information: Health insurance coverage in the United States
The United States tax code includes deductions and penalties with regard to health insurance coverage. The number of uninsured in the United States, many of whom are the working poor or unemployed, is one of the primary concerns raised by advocates of health care reform.
The costs of treating the uninsured must often be absorbed by providers as charity care, passed on to the insured via cost shifting and higher health insurance premiums, or paid by taxpayers through higher taxes. The federal income tax offers employers a deduction for amounts contributed to health care plans.
Beginning in 2014, the Patient Protection and Affordable Care Act encouraged states to expand Medicaid for low-income households, funded by additional federal taxes. Some of the taxes specifically target wealthier households. Income from self-employment and wages of single individuals in excess of $200,000 annually is subject to an additional tax of 0.9%.
The threshold amount is $250,000 for a married couple filing jointly (threshold applies to joint compensation of the two spouses), or $125,000 for a married person filing separately.
In addition, a Medicare tax of 3.8% will apply to unearned income, specifically the lesser of net investment income or the amount by which adjusted gross income exceeds $200,000 ($250,000 for a married couple filing jointly; $125,000 for a married person filing separately.)
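The two ACA surtaxes described above can be sketched for a single filer (a simplified illustration of the stated thresholds and rates; real calculations involve more detail):

```python
def additional_medicare_tax(earned_income, threshold=200_000):
    """0.9% surtax on wages/self-employment income above the threshold
    ($200,000 single; $250,000 joint; $125,000 married filing separately)."""
    return 0.009 * max(0, earned_income - threshold)

def net_investment_income_tax(net_investment_income, magi, threshold=200_000):
    """3.8% on the lesser of net investment income or the excess of
    modified adjusted gross income over the threshold."""
    return 0.038 * max(0, min(net_investment_income, magi - threshold))
```

For example, a single filer earning $250,000 in wages owes 0.9% on the $50,000 above the threshold; a filer with $100,000 of investment income and $260,000 of MAGI owes 3.8% on $60,000, the smaller of the two amounts.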
In March 2018, the CBO reported that the ACA had reduced income inequality in 2014, saying that the law led the lowest and second quintiles (the bottom 40%) to receive an average of an additional $690 and $560 respectively while causing households in the top 1% to pay an additional $21,000 due mostly to the net investment income tax and the additional Medicare tax. The law placed relatively little burden on households in the top quintile (top 20%) outside of the top 1%.
Compression and divergence in tax code changes:
Princeton economics professor, Nobel laureate, and John Bates Clark Medal winner Paul Krugman argues that politics, not economic conditions, has made income inequality in the United States "unique" and of a degree that "other advanced countries have not seen."
According to Krugman, government action can either compress or widen income inequality through tax policy and other redistributive or transfer policies. Krugman illustrates this point by describing "The Great Compression" and "The Great Divergence."
Krugman states that the end of the Great Depression to the end of World War II, from 1939–1946, saw a rapid narrowing of the spread of the income distribution in America which effectively created the middle class. Krugman calls this economic time period "The Great Compression" because the income distribution was compressed.
Krugman attributes this phenomenon to intrinsically equalizing economic policy such as increased tax rates on the wealthy, higher corporate tax rates, a pro-union organizing environment, minimum wage, Social Security, unemployment insurance, and "extensive government controls on the economy that were used in a way that tended to equalize incomes."
This "artificial[ly]" created middle class endured due to the creation of middle class institutions, norms, and expectations that promoted income equality. Krugman believes this period ends in 1980, which he points out as being "interesting" because it was when "Reagan came to the White House."
From 1980 to the present, Krugman believes income inequality has been shaped uniquely by the political environment and not by the global economic environment. For example, the U.S. and Canada both had approximately 30% of their workers in unions during the 1960s. However, by 2010, around 25% of Canadian workers were still unionized while only 11% of American workers were.
Krugman blames Reagan for this rapid decline in unionization because he "declared open season on unions" while the global market clearly made room for unions as Canada's high union rate proves.
Contrary to the arguments made by Chicago economists such as Gary Becker, Krugman points out that while the wealth gap between the college educated and non-college educated continues to grow, the largest rise in income inequality has been among college graduates themselves, not between college graduates and non-graduates.
The average high school teacher, according to Krugman, has a post-graduate degree which is a comparable level of education to a hedge fund manager whose income is several times that of the average high school teacher. In 2006, the "highest paid hedge fund manager in the United States made an amount equal to the salaries of all 80,000 New York City school teachers for the next three years."
Accordingly, Krugman believes that education and a shifting global market are not the sole causes of increased income inequalities since the 1980s but rather that politics and the implementation of conservative ideology has aggregated wealth to the rich. Some of these political policies include the Reagan tax cuts in 1981 and 1986.
Nobel laureate Joseph Stiglitz asserts in a Vanity Fair article published in May 2011 entitled "Of the 1%, by the 1%, for the 1%" that "preferential tax treatment for special interests" has helped increase income inequality in the United States as well as reduced the efficiency of the market. He specifically points to the reduction in capital gains over the last few years, which are "how the rich receive a large portion of their income," as giving the wealthy a "free ride."
Stiglitz criticizes the "marginal productivity theory," saying that the largest gains in wages are going toward what are, in his opinion, less than worthy occupations such as finance, whose effects have been "massively negative." Accordingly, he asks, if income inequality is predominately explained by rising marginal productivity of the educated, then why are financiers, who are responsible for bringing the U.S. economy "to the brink of ruin," so richly rewarded?
Thomas Piketty and Emmanuel Saez wrote in their work "Income Inequality in the United States, 1913–1998" that "top income and wages shares (in the United States) display a U-shaped pattern over the century" and "that the large shocks that capital owners experienced during the Great Depression and World War II have had a permanent effect on top capital incomes...that steep progressive income and estate taxation may have prevented large fortunes from fully recovering from these shocks."
Saez and Piketty argue that the "working rich" are now at the top of the income ladder in the United States and their wealth far out-paces the rest of the country. Piketty and Saez plotted the percentage share of total income accrued by the top 1%, top 5%, and the top 10% of wage earners in the United States from 1913-2008. According to their data, the top 1% controlled 10% of the total income while the top 5% owned approximately 13% and the top 10% possessed around 12% of total income.
By 1984, the percentage of total income owned by the top 1% rose from 10% to 16% while income shares of the top 5% and top 10% controlled 13.5% and 12%, respectively. The growth in income for the top 1% then rose up to 22% by 1998 while the income growth rates for the top 5% and top 10% remained constant (15% total share of income and 12% total share of income, respectively).
The percentage share of total income owned by the top 1% fell to 16% during the post-9/11 recession but rose again to its 1998 level by 2008. In 2008, the gap in percentage share of total income between the top 1% and top 5% was 7 percentage points, and the gap between the top 1% and top 10% was 9 percentage points. This is an 11-percentage-point reversal from the respective shares of income held by these groups in 1963. Income inequality clearly accelerated beginning in the 1980s.
Larry Bartels, a Princeton political scientist and the author of Unequal Democracy, argues that federal tax policy since 1964, and even before, has increased economic inequality in the United States. He states that the real income growth rate for low- and middle-class workers is significantly smaller under Republican administrations than under Democratic administrations, while the real income growth rate for the upper class is much larger under Republican administrations than under Democratic ones.
Bartels finds that from 1948 to 2005, pre-tax real income for the bottom 20% grew by 1.42% while pre-tax real income for the top 20% grew by 2%. Under the Democratic administrations of this period (Truman, Kennedy, Johnson, Carter, and Clinton), the pre-tax real income growth rate for the bottom 20% was 2.64% while that for the top 20% was 2.12%.
During the Republican administrations of this time period (Eisenhower, Nixon, Ford, Reagan, G. H. W. Bush, and G. W. Bush), the pre-tax real income growth rate was 0.43% for the bottom 20% and 1.90% for the top 20%.
The disparity under Democratic presidents in this time period between the top and bottom 20% pre-tax real income growth rate was -0.52% while the disparity under Republican presidents was 1.47%.
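The disparity figures follow directly from Bartels's growth rates; a quick arithmetic check, using only the numbers quoted above:

```python
# Average annual pre-tax real income growth rates (percent), 1948-2005,
# as reported by Bartels for each party's administrations.
dem = {"bottom_20": 2.64, "top_20": 2.12}
rep = {"bottom_20": 0.43, "top_20": 1.90}

# Disparity = top-quintile growth minus bottom-quintile growth.
dem_gap = dem["top_20"] - dem["bottom_20"]  # -0.52: the bottom grew faster
rep_gap = rep["top_20"] - rep["bottom_20"]  # 1.47: the top grew faster
print(round(dem_gap, 2), round(rep_gap, 2))
```

The negative sign under Democratic administrations means the bottom 20% actually out-grew the top 20% in that period.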
The pre-tax real income growth rate for the wealthiest 40%, 60%, and 80% of the population was higher under the Democratic administrations than it was under the Republican administrations in this time period. The United States was more equal and growing wealthier, based on income, under Democratic Presidents from 1948 to 2005 than it was under Republican Presidents in the same time period.
Additionally, Bartels believes that the reduction and the temporary repeal of the estate tax also increased income inequality by benefiting almost exclusively the wealthiest in America.
According to a working paper released by the Society for the Study of Economic Inequality entitled "Tax policy and income inequality in the U.S., 1978–2009: A decomposition approach," tax policy can either exacerbate or curtail economic inequality.
This article argues that tax policy reforms passed under Republican administrations since 1979 have increased economic inequality, while reforms under Democratic administrations during the same period have reduced it. The net effect of tax reforms on economic inequality since 1979 is essentially zero, as the opposing policies neutralized each other.
Policy responses:
Public policy responses addressing causes and effects of income inequality include:
- progressive tax incidence adjustments,
- strengthening social safety net provisions,
- increasing and reforming higher education subsidies,
- increasing infrastructure spending, and
- placing limits on and taxing rent-seeking.
Taxes on the wealthy:
The Congressional Budget Office reported that less progressive tax and transfer policies contributed to an increase in after-tax income inequality between 1979 and 2007. This indicates that more progressive income tax policies (e.g., higher income taxes on the wealthy and a higher earned-income tax credit) would reduce after-tax income inequality.
In their World Inequality Report published in December 2017, Piketty, Saez and coauthors revealed that in "Russia and the United States, the rise in wealth inequality has been extreme, whereas in Europe it has been more moderate."
They reported that the tax system in the United States, along with "massive educational inequalities", have grown "less progressive despite a surge in top labor compensation since the 1980s, and in top capital incomes in the 2000s."
The "top 1% income share was close to 10% in the [US and Europe] in 1980, it rose only slightly to 12% in 2016 in Western Europe [where taxation and education policies are more progressive] while it shot up to 20% in the United States."
The "bottom 50% income share decreased from more than 20% in 1980 to 13% in 2016." In 2012, the economists Emmanuel Saez and Thomas Piketty had recommended much higher top marginal tax rates on the wealthy, up to 50 percent, 70 percent or even 90 percent.
Ralph Nader, Jeffrey Sachs, the United Front Against Austerity, among others, call for a financial transactions tax (also known as the Robin Hood tax) to bolster the social safety net and the public sector.
The Pew Center reported in January 2014 that 54% of Americans supported raising taxes on the wealthy and corporations to expand aid to the poor. By party, 29% of Republicans and 75% of Democrats supported this action.
Senator Elizabeth Warren proposed an annual tax on wealth in January 2019, specifically a 2% tax for wealth over $50 million and another 1% surcharge on wealth over $1 billion. Wealth is defined as including all asset classes, including financial assets and real estate. Economists Emmanuel Saez and Gabriel Zucman estimated that about 75,000 households (less than 0.1%) would pay the tax.
The tax would raise around $2.75 trillion over 10 years, roughly 1% of GDP per year on average. It would raise the total tax burden for those subject to the wealth tax from 3.2% of their wealth under current law to about 4.3% on average, versus 7.2% for families in the bottom 99%. For scale, the federal budget deficit in 2018 was 3.9% of GDP and is expected to rise toward 5% of GDP over the next decade. The plan received both praise and criticism.
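The mechanics of the proposal can be sketched in a few lines. This assumes the 2% and 1% rates apply marginally above the $50 million and $1 billion thresholds; the function name and the marginal treatment are illustrative assumptions, not the bill text:

```python
def proposed_wealth_tax(wealth):
    """Annual tax under the proposed schedule: 2% on net worth above
    $50 million, plus a 1% surcharge on net worth above $1 billion
    (i.e., a 3% marginal rate past that threshold). Illustrative only."""
    tax = 0.0
    if wealth > 50e6:
        tax += 0.02 * (wealth - 50e6)
    if wealth > 1e9:
        tax += 0.01 * (wealth - 1e9)
    return tax

# A household worth $40 million owes nothing; a $2 billion fortune
# owes 2% of $1.95B plus 1% of $1B, about $49 million per year.
print(proposed_wealth_tax(40e6), proposed_wealth_tax(2e9))
```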
Two billionaires, Michael Bloomberg and Howard Schultz, criticized the proposal as "unconstitutional" and "ridiculous," respectively. Warren was not surprised by this reaction, stating: "Another billionaire who thinks that billionaires shouldn't pay more in taxes."
Economist Paul Krugman wrote in January 2019 that polls indicate the idea of taxing the rich more is very popular.
Senators Charles Schumer and Bernie Sanders advocated limiting stock buybacks in January 2019. They explained that from 2008 to 2017, 466 of the S&P 500 companies spent $4 trillion on stock buybacks, about 50% of profits, with another 40% of profits going to dividends.
During 2018 alone, a record $1 trillion was spent on buybacks. Stock buybacks shift wealth upwards, because the top 1% own about 40% of shares and the top 10% own about 85%.
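As a back-of-the-envelope illustration of that upward shift, using only the ownership shares quoted above (the figures are approximate):

```python
# 2018 buybacks were a record ~$1 trillion; the top 1% hold ~40% of
# shares and the top 10% ~85%, so most of the proceeds flow upward.
buybacks_2018 = 1.0e12

top1_gain = 0.40 * buybacks_2018   # ~ $400 billion to the top 1%
top10_gain = 0.85 * buybacks_2018  # ~ $850 billion to the top 10%
print(f"top 1%: ${top1_gain/1e9:.0f}B, top 10%: ${top10_gain/1e9:.0f}B")
```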
Further, corporations directing profits to shareholders are not reinvesting the money in the firm or paying workers more. They wrote: "If corporations continue to purchase their own stock at this rate, income disparities will continue to grow, productivity will suffer, the long-term strength of companies will diminish — and the American worker will fall further behind."
Their proposed legislation would prohibit buybacks unless the corporation has taken other steps first, such as paying workers more, providing more benefits such as healthcare and pensions, and investing in the community. To prevent corporations from shifting from buybacks to dividends, they proposed limiting dividends, perhaps by taking action through the tax code.
See also:
Citizenship in the United States including Immigrants who wish to become Citizens Pictured: Swearing in of New American Citizens
Citizenship of the United States is a status that entails specific rights, duties and benefits. Citizenship is understood as a "right to have rights" since it serves as a foundation of fundamental rights derived from and protected by the Constitution and laws of the United States, such as the rights to freedom of expression, to vote, to due process, to live and work in the United States, and to receive federal assistance.
The implementation of citizenship requires attitudes including allegiance to the republic, participation, and an impulse to promote communities. Certain rights are so fundamental that they are guaranteed to all persons, not just citizens. These include those rights guaranteed by the first 8 Amendments that pertain to individuals. However, not all U.S. citizens, such as those living in Puerto Rico, have the right to vote in federal elections.
There are two primary sources of citizenship: birthright citizenship, in which a person is presumed to be a citizen if he or she was born within the territorial limits of the United States, or—providing certain other requirements are met—born abroad to a U.S. citizen parent, and naturalization, a process in which an eligible legal immigrant applies for citizenship and is accepted.
These two pathways to citizenship are specified in the Citizenship Clause of the Constitution's 1868 Fourteenth Amendment, which reads: "All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside." — 14th Amendment
National citizenship signifies membership in the country as a whole; state citizenship, in contrast, signifies a relation between a person and a particular state and has application generally limited to domestic matters. State citizenship may affect (1) tax decisions, (2) eligibility for some state-provided benefits such as higher education, and (3) eligibility for state political posts such as U.S. senator.
In Article One of the Constitution, the power to establish a "uniform rule of naturalization" is granted explicitly to Congress.
U.S. law permits multiple citizenship. A citizen of another country naturalized as a U.S. citizen may retain their previous citizenship, though they must renounce allegiance to the other country. A U.S. citizen retains U.S. citizenship when becoming the citizen of another country, should that country's laws allow it. U.S. citizenship can be renounced by Americans who also hold another citizenship via a formal procedure at a U.S. embassy.
Click on any of the following blue hyperlinks for more about Citizenship of the United States:
- Rights, duties, and benefits
- Civic participation
- Dual citizenship
- History of citizenship in the United States
- Birthright citizenship
- Naturalized citizenship
- Honorary citizenship
- Corporate citizenship
- Distinction between citizenship and nationality
- Controversies
- Relinquishment of citizenship
- See also:
- Accidental American
- Anchor baby
- Birth tourism
- Birthright citizenship in the United States of America
- Birthright generation
- Citizenship (general discussion for all nations)
- Citizenship education
- DREAM Act
- History of citizenship
- Jus soli
- Natural born citizen of the United States
- Undocumented students in the United States
- Undocumented youth in the United States
United States Nationality Law:
The United States nationality law refers to the uniform rule of naturalization of the United States set out in the Immigration and Nationality Act of 1952, enacted under the power of Article I, section 8, clause 4 of the United States Constitution (also referred to as the Nationality Clause), which grants the Congress the power to "establish a uniform Rule of Naturalization..."
The 1952 Act sets forth the legal requirements for the acquisition of, and divestiture from, American nationality. The requirements have become more explicit since the ratification of the Fourteenth Amendment to the Constitution, with the most recent changes to the law having been made by Congress in 2001.
Click on any of the following blue hyperlinks for more about the United States Nationality Law:
- Rights and responsibilities of U.S. citizens
- Acquisition of citizenship
- Dual citizenship
- Travel freedom of American citizens
- Nationals
- Citizenship at birth on the U.S. territories and former U.S. territories
- Loss of citizenship
- Emigration from United States
- See also:
- Honorary Citizen of the United States
- Natural-born citizen
- United States v. Matheson
- Immigration and Nationality Act
- U.S. Citizenship Information (USCIS)
- U.S. Naturalization (USCIS)
- U.S. Citizenship Laws & Policy (U.S. State Department)
- U.S. regulations regarding loss and restoration of citizenship
- Free Online U.S. Citizenship Practice Test (USCIS)
Equal Employment Opportunity Commission
- YouTube Video about the Equal Employment Opportunity Commission
- YouTube Video: Human Resource Basics: Equal Employment Opportunity
- YouTube Video: Legally Speaking: How to File with the EEOC - Equal Employment Opportunity Commission
The U.S. Equal Employment Opportunity Commission (EEOC) is a federal agency that administers and enforces civil rights laws against workplace discrimination.
The EEOC investigates discrimination complaints based on an individual's race, color, national origin, religion, sex, age, disability, sexual orientation, gender identity, genetic information, and retaliation for reporting, participating in, and/or opposing a discriminatory practice.
Click on any of the following blue hyperlinks for more about the U.S. Equal Employment Opportunity Commission (EEOC):
- History
- Staffing, workload, and backlog
- Race and ethnicity
- Investigative compliance policy
- Increase in disability-based charges
- Home Depot disability discrimination suit
- 2012 profile
- Successes
- Criticism
- General Counsels
- Commissioners
- Chairs
- See also
- Official website
- Proposed and finalized federal regulations from the Equal Employment Opportunity Commission
- Role of Equal employment opportunity commission
- Records of the Equal Employment Opportunity Commission in the National Archives (Record Group 403)
- nytimes.com: an article discussing the recent case involving allegations that Bloomberg unfairly treated pregnant women. Bloomberg won because of a lack of statistics on the Equal Employment Opportunity Commission's part; however, the federal government can still appeal, and the witnesses can individually sue Bloomberg for discrimination.
- https://en.wikisource.org/wiki/Federal_Civil_Penalties_Inflation_Adjustment_Act_of_1990
- Equal Pay Act of 1963
- Title 29 of the Code of Federal Regulations
- PATCOB
- Pregnancy discrimination
- Race and ethnicity (EEO)
- Katherine Pollak Ellickson
- USA.gov
- USAFacts
Democratic Ideals, including Living the American Dream
- YouTube Video: Democratic ideals in the preamble of the US Constitution
- YouTube Video: Living the American Dream: Greg Smith at TEDxCincy
- YouTube Video: Immigrant Perspectives: The American Dream
Democratic ideals is an expression used to reflect personal qualities or standards of government behavior that are felt to be essential for the continuation of a democratic policy.
Advocates for causes across the political spectrum use this expression in attempting to engage in persuasion, particularly by contrasting a situation that has been allowed to continue for pragmatic or social reasons with a democratic ideal it is said to violate, such as equality of opportunity.
Other times, advocates of one political outlook or another will use the expression to energize support among their constituencies, despite knowing that their political opponents use precisely the same phrase to do precisely the same thing.
Frequently the importance of human rights is listed as a central democratic ideal, as well as instilling in military and civilian governmental personnel the attitudes and methods which will prevent their actions from infringing on those rights.
Democratic ideals are often cited as a reason for patriotism, for example Woodrow Wilson's argument that America needed to enter World War I in order to make the world "safe for democracy".
Other uses of the term
In historical texts, the phrase is often used to denote aspirations or norms of behavior, separate from a functioning democracy, including egalitarianism, self-government, self-determination and freedom of conscience.
See also:
- Athenian democracy
- Civil rights
- Constitutional liberalism
- Democratic socialism
- Direct and indirect democracy
- Due process
- Egalitarianism
- Equality before the law
- Liberal democracy
- Natural rights
- Open society
- Pluralism (political philosophy)
- Popular sovereignty
- Social democracy
American Dream
The American Dream is a national ethos of the United States, the set of ideals (democracy, rights, liberty, opportunity and equality) in which freedom includes the opportunity for prosperity and success, as well as upward social mobility for the family and children, achieved through hard work in a society with few barriers.
In the definition of the American Dream by James Truslow Adams in 1931, "life should be better and richer and fuller for everyone, with opportunity for each according to ability or achievement" regardless of social class or circumstances of birth.
The American Dream is rooted in the Declaration of Independence, which proclaims that "all men are created equal" with the right to "life, liberty and the pursuit of happiness." Also, the U.S. Constitution promotes similar freedom, in the Preamble: to "secure the Blessings of Liberty to ourselves and our Posterity".
Click on any of the following blue hyperlinks for more about "living the American Dream"
- History
- Literature
- Political leaders
- Public opinion
- Four dreams of consumerism
- Other parts of the world
- See also:
Civil Discourse and The Golden Rule
TOP: A Forum Space Dedicated to Public Debate and Civil Discourse
BOTTOM: The Top 5 Rules of Love
- YouTube Video: A framework for civil discourse about race and racism | Wornie Reed | TEDxVirginiaTech
- YouTube Video: Ben Shapiro: Civil Discourse | Real Time with Bill Maher (HBO)
- YouTube Video: The Golden Rule | Treat others with KINDNESS
[Your Web Host: These two topics were spliced together as the (largely) religious significance of the Golden Rule helps to make civil discourse more possible in civil exchanges between humans]
Civil discourse is engagement in discourse (conversation) intended to enhance understanding. Kenneth J. Gergen describes civil discourse as "the language of dispassionate objectivity", and suggests that it requires respect of the other participants, such as the reader.
It neither diminishes the other's moral worth, nor questions their good judgment; it avoids hostility, direct antagonism, or excessive persuasion; it requires modesty and an appreciation for the other participant's experiences.
In Book III of An Essay Concerning Human Understanding (1690), John Locke distinguishes between civil and philosophical (or rhetorical) discourse, with the former being for the benefit of the reader and the public good:
- "First, by their civil use, I mean such a communication of thoughts and ideas by words, as may serve for the upholding common conversation and commerce, about the ordinary affairs and conveniences of civil life, in the societies of men, one amongst another. Secondly, by the philosophical use of words, I mean such a use of them as may serve to convey the precise notions of things, and to express in general propositions certain and undoubted truths, which the mind may rest upon and be satisfied with in its search after true knowledge. These two uses are very distinct; and a great deal less exactness will serve in the one than in the other, as we shall see in what follows."
See also:
- Rhetoric
- Civic virtue
- Etiquette
- Discourse community
- Self-censorship
- Speech code
- Citizens for Civil Discourse
- Dogmatism Versus Civil Discourse (2005), The Open Mind talk show with John Sexton (video)
The Golden Rule is the principle of treating others as you want to be treated. It is a maxim that is found in many religions and cultures. It can be considered an ethic of reciprocity in some religions, although other religions treat it differently. The maxim may appear as a positive or negative injunction governing conduct:
- Treat others as you would like others to treat you (positive or directive form)
- Do not treat others in ways that you would not like to be treated (negative or prohibitive form)
- What you wish upon others, you wish upon yourself (empathic or responsive form)
The idea dates at least to the early Confucian times (551–479 BC), according to Rushworth Kidder, who identifies that this concept appears prominently in Buddhism, Christianity, Hinduism, Judaism, Taoism, Zoroastrianism, and "the rest of the world's major religions".
The concept of the Rule is codified in the Code of Hammurabi stele and tablets (1790–1754 BC). 143 leaders of the world's major faiths endorsed the Golden Rule as part of the 1993 "Declaration Toward a Global Ethic". According to Greg M. Epstein, it is "a concept that essentially no religion misses entirely", but belief in God is not necessary to endorse it. Simon Blackburn also states that the Golden Rule can be "found in some form in almost every ethical tradition".
The term "Golden Rule", or "Golden law", began to be used widely in the early 17th century in Britain by Anglican theologians and preachers; the earliest known usage is that of Anglicans Charles Gibbon and Thomas Jackson in 1604.
Click here for more about "The Golden Rule"
The National Association for the Advancement of Colored People (NAACP) and "Black Lives Matter"
TOP: "Is the NAACP Becoming Irrelevant in 2018?"
BOTTOM: "The meaning of Black Lives Matter, one year later"
- YouTube Video How the NAACP Fights Racial Discrimination | History
- YouTube Video: A Look Into The Movement's History | Long Story Short | NBC News
- YouTube Video: Obama urges Americans to reject racist language from top
The National Association for the Advancement of Colored People (NAACP) is a civil rights organization in the United States, formed in 1909 as a bi-racial endeavor to advance justice for African Americans by a group including W. E. B. Du Bois, Mary White Ovington and Moorfield Storey.
Its mission in the 21st century is "to ensure the political, educational, social, and economic equality of rights of all persons and to eliminate race-based discrimination." National NAACP initiatives include political lobbying, publicity efforts and litigation strategies developed by its legal team.
Its mission in the 21st century is "to ensure the political, educational, social, and economic equality of rights of all persons and to eliminate race-based discrimination." National NAACP initiatives include political lobbying, publicity efforts and litigation strategies developed by its legal team.
The group enlarged its mission in the late 20th century by considering issues such as police misconduct, the status of black foreign refugees and questions of economic development. Its name, retained in accordance with tradition, uses the once common term colored people, referring to those with some African ancestry.
The NAACP bestows annual awards to African Americans in two categories: Image Awards are for achievement in the arts and entertainment, and Spingarn Medals are for outstanding achievement of any kind. Its headquarters is in Baltimore, Maryland.
Organization
The NAACP is headquartered in Baltimore, with additional regional offices in New York, Michigan, Georgia, Maryland, Texas, Colorado and California. Each regional office is responsible for coordinating the efforts of state conferences in that region. Local, youth, and college chapters organize activities for individual members.
In the U.S., the NAACP is administered by a 64-member board, led by a chairperson. The board elects one person as the president and one as chief executive officer for the organization. Julian Bond, Civil Rights Movement activist and former Georgia State Senator, was chairman until replaced in February 2010 by health-care administrator Roslyn Brock.
For decades in the first half of the 20th century, the organization was effectively led by its executive secretary, who acted as chief operating officer. James Weldon Johnson and Walter F. White, who served in that role successively from 1920 to 1958, were much more widely known as NAACP leaders than were presidents during those years.
The organization has never had a woman president, except on a temporary basis, and there have been calls to name one. Lorraine C. Miller served as interim president after Benjamin Jealous stepped down. Maya Wiley was rumored to be in line for the position in 2013, but Cornell William Brooks was selected.
Departments within the NAACP govern areas of action. Local chapters are supported by the 'Branch and Field Services' department and the 'Youth and College' department. The 'Legal' department focuses on court cases of broad application to minorities, such as systematic discrimination in employment, government, or education.
The Washington, D.C., bureau is responsible for lobbying the U.S. government, and the Education Department works to improve public education at the local, state and federal levels. The goal of the Health Division is to advance health care for minorities through public policy initiatives and education.
As of 2007, the NAACP had approximately 425,000 paying and non-paying members.
The NAACP's non-current records are housed at the Library of Congress, which has served as the organization's official repository since 1964. The records held there comprise approximately five million items spanning the NAACP's history from the time of its founding until 2003.
In 2011, the NAACP teamed with the digital repository ProQuest to digitize and host online the earlier portion of its archives, through 1972 – nearly two million pages of documents from the national, legal, and branch offices throughout the country. These records offer first-hand insight into the organization's work on such crucial issues as lynching, school desegregation, and discrimination in all its aspects, including in the military, the criminal justice system, employment, and housing.
Click on any of the following blue hyperlinks for more about the NAACP:
- Predecessor: The Niagara Movement
- History
- Geography
- Current activities
- Awards
- Partner organizations
- See also:
- Civil rights movement (1896–1954)
- Chicago Better Housing Association
- The Crisis, official magazine
- NAACP New Orleans Branch
- NAACP Theatre Awards
- NAACP Theatre Award – President's Award
- Niagara Movement
- Racial integration
- Overview of NAACP records at the Library of Congress, the official repository of the national organization
- NAACP branches database, including membership numbers and officer names. From the Mapping American Social Movements project at the University of Washington.
- Niagara Movement Du Bois Papers, Special Collections and University Archives, UMass Amherst
- National Association for the Advancement of Colored People, Region 1 Photograph Collection, ca. 1940–1982 at The Bancroft Library
- National Association for the Advancement of Colored People, Region I, Records, 1942–1986 (bulk 1945–1977) at The Bancroft Library
- National Association for the Advancement of Colored People, Vancouver Branch records. 1914–1967. 2.10 cubic feet (5 boxes). At the Labor Archives of Washington, University of Washington Libraries Special Collections
- NAACP Convention in Atlanta, Civil Rights Digital Library.
Black Lives Matter (BLM) is an international activist movement, originating in the African-American community, that campaigns against violence and systemic racism towards black people.
BLM regularly holds protests against police killings of black people and against broader issues such as racial profiling, police brutality, and racial inequality in the United States criminal justice system.
The movement began in 2013 with the use of the hashtag #BlackLivesMatter on social media, after George Zimmerman was acquitted in the February 2012 shooting death of African-American teen Trayvon Martin.
Black Lives Matter became nationally recognized for its street demonstrations following the 2014 deaths of two African Americans: Michael Brown—resulting in protests and unrest in Ferguson, a city near St. Louis—and Eric Garner in New York City. Since the Ferguson protests, participants in the movement have demonstrated against the deaths of numerous other African Americans by police actions or while in police custody.
In the summer of 2015, Black Lives Matter activists became involved in the 2016 United States presidential election. The originators of the hashtag and call to action, Alicia Garza, Patrisse Cullors, and Opal Tometi, expanded their project into a national network of over 30 local chapters between 2014 and 2016. The overall Black Lives Matter movement, however, is a decentralized network and has no formal hierarchy.
There have been many reactions to the Black Lives Matter movement. The U.S. population's perception of Black Lives Matter varies considerably by race. The phrase "All Lives Matter" sprang up as a response to the Black Lives Matter movement, but has been criticized for dismissing or misunderstanding the message of "Black Lives Matter".
Following the shooting of two police officers in Ferguson, the hashtag Blue Lives Matter was created by supporters of the police. Some black civil rights leaders have disagreed with tactics used by Black Lives Matter activists.
Click on any of the following blue hyperlinks for more about "Black Lives Matter":
- Founding
- Structure and organization
- Strategies and tactics
- Timeline of notable US events and demonstrations
- BLM international movement
- 2016 U.S. presidential election
- Counter-slogans and movements
- Criticism of "Black Lives Matter"
- Influence
- See also:
- Black Identity Extremists
- Black Twitter
- De-escalation#United States of America
- H.R. 40 - Commission to Study and Develop Reparation Proposals for African-Americans Act
- Racism in the United States
- Reparations for slavery
- Say Her Name
- Taking a Stand in Baton Rouge
- The Hate U Give (novel and film)
- The personal is political
- Woke
- Official website
- List of 1007 Black Lives Matter demonstrations
- Campaign Zero to end police violence
- "Read This: #BlackLivesMatter Reads for Teens". Minnesota: Hennepin County Library. Archived from the original on September 21, 2016. (Bibliography)
- "#blacklivesmatter". American Library Association, Young Adult Library Services Association. (Bibliography)
American Civil Liberties Union (ACLU)
- YouTube Video: What is the History of ACLU
- YouTube Video: WE HAVE RIGHTS: When ICE Is Outside Our Doors (English)
- YouTube Video by Nadine Strossen: What does the A.C.L.U. stand for?
The American Civil Liberties Union (ACLU) is a nonprofit organization whose stated mission is "to defend and preserve the individual rights and liberties guaranteed to every person in this country by the Constitution and laws of the United States."
Officially nonpartisan, the organization has been supported and criticized by liberal and conservative organizations alike. The ACLU works through litigation and lobbying; it has more than 1,200,000 members and an annual budget of over $100 million.
Local affiliates of the ACLU are active in all 50 states, the District of Columbia, and Puerto Rico.
The ACLU provides legal assistance in cases where it considers civil liberties to be at risk. Legal support from the ACLU can take the form of direct legal representation or the preparation of amicus curiae briefs expressing legal arguments when another law firm is already providing representation.
In addition to representing persons and organizations in lawsuits, the ACLU lobbies for policy positions that have been established by its board of directors. Current positions of the ACLU include:
- opposing the death penalty;
- supporting same-sex marriage and the right of LGBT people to adopt;
- supporting birth control and abortion rights;
- eliminating discrimination against women, minorities, and LGBT people;
- supporting the rights of prisoners and opposing torture; and opposing government preference for religion over non-religion, or for particular faiths over others.
Legally, the ACLU consists of two separate but closely affiliated nonprofit organizations: the American Civil Liberties Union, a 501(c)(4) social welfare group, and the ACLU Foundation, a 501(c)(3) public charity.
Both organizations engage in civil rights litigation, advocacy, and education, but only donations to the 501(c)(3) foundation are tax deductible, and only the 501(c)(4) group can engage in unlimited political lobbying. The two organizations share office space and employees.
Overview:
The ACLU was founded in 1920 by a committee including the following:
- Helen Keller,
- Roger Baldwin,
- Crystal Eastman,
- Walter Nelles,
- Morris Ernst,
- Albert DeSilver,
- Arthur Garfield Hays,
- Jane Addams,
- Felix Frankfurter,
- Elizabeth Gurley Flynn,
- and Rose Schneiderman.
The ACLU's initial focus was on freedom of speech, primarily for anti-war protesters. During the 1920s, the ACLU expanded its scope to include protecting the free speech rights of artists and striking workers, and working with the National Association for the Advancement of Colored People (NAACP) to decrease racism and discrimination.
During the 1930s, the ACLU started to engage in work combating police misconduct and supporting Native American rights. Many of the ACLU's cases involved the defense of Communist Party members and Jehovah's Witnesses.
In 1940, the ACLU leadership voted to exclude communists from its leadership positions, a decision rescinded in 1968.
During World War II, the ACLU defended Japanese-American citizens, unsuccessfully trying to prevent their forcible relocation to internment camps. During the Cold War, the ACLU headquarters was dominated by anti-communists, but many local affiliates defended members of the Communist Party.
By 1964, membership had risen to 80,000, and the ACLU participated in efforts to expand civil liberties. In the 1960s, the ACLU continued its decades-long effort to enforce separation of church and state. It defended several anti-war activists during the Vietnam War.
The ACLU was involved in the Miranda case, which addressed conduct by police during interrogations, and in the New York Times case, which established new protections for newspapers reporting on government activities. In the 1970s and 1980s, the ACLU ventured into new legal areas, involving the rights of homosexuals, students, prisoners, and the poor.
In the twenty-first century, the ACLU has fought the teaching of creationism in public schools and challenged some provisions of anti-terrorism legislation as infringing on privacy and civil liberties. Fundraising and membership spiked after the 2016 election; the ACLU's current membership is more than 1.2 million.
Organization:
Leadership:
As of 2015, the ACLU was led by its president, Susan N. Herman, and its executive director, Anthony Romero.
The president acts as chair of the ACLU's board of directors, leads fundraising, and facilitates policy-setting.
The executive director manages the day-to-day operations of the organization.
The board of directors consists of 80 persons, including representatives from each state affiliate, as well as at-large delegates. The organization has its headquarters at 125 Broad Street, a 40-story skyscraper in Lower Manhattan, New York City.
The leadership of the ACLU does not always agree on policy decisions; differences of opinion within the ACLU leadership have sometimes grown into major debates. In 1937, an internal debate erupted over whether to defend Henry Ford's right to distribute anti-union literature.
In 1939, a heated debate took place over whether to prohibit communists from serving in ACLU leadership roles. During the early 1950s and Cold War McCarthyism, the board was divided on whether to defend communists.
In 1968, a schism formed over whether to represent Benjamin Spock's anti-war activism.
In 1973, there was internal conflict over whether to call for the impeachment of Richard Nixon.
In 2005, there was internal conflict over whether a gag rule should be imposed on ACLU employees to prevent publication of internal disputes.
Funding:
In the year ending March 31, 2014, the ACLU and the ACLU Foundation had a combined income from support and revenue of $100.4 million, originating from grants (50.0%), membership donations (25.4%), donated legal services (7.6%), bequests (16.2%), and revenue (0.9%).
Membership dues are treated as donations; members choose the amount they pay annually, averaging approximately $50 per member per year.
In the year ending March 31, 2014, the combined expenses of the ACLU and ACLU Foundation were $133.4 million, spent on programs (86.2%), management (7.4%), and fundraising (8.2%). (After factoring in other changes in net assets of +$30.9 million, from sources such as investment income, the organization had an overall decrease in net assets of $2.1 million.)
Over the period from 2011 to 2014, the ACLU Foundation accounted for roughly 70% of the combined budget on average, and the ACLU for roughly 30%.
The ACLU solicits donations to its charitable foundation. The ACLU is accredited by the Better Business Bureau, and Charity Navigator has given the ACLU a four-star rating.
The local affiliates solicit their own funding; however, some also receive funds from the national ACLU, with the distribution and amount of such assistance varying from state to state.
At its discretion, the national organization provides subsidies to smaller affiliates that lack sufficient resources to be self-sustaining; for example, the Wyoming ACLU chapter received such subsidies until April 2015, when, as part of a round of layoffs at the national ACLU, the Wyoming office was closed.
In October 2004, the ACLU rejected $1.5 million from both the Ford Foundation and Rockefeller Foundation because the foundations had adopted language from the USA PATRIOT Act in their donation agreements, including a clause stipulating that none of the money would go to "underwriting terrorism or other unacceptable activities." The ACLU views this clause, both in federal law and in the donors' agreements, as a threat to civil liberties, saying it is overly broad and ambiguous.
Due to the nature of its legal work, the ACLU is often involved in litigation against governmental bodies, which are generally protected from adverse monetary judgments; a town, state or federal agency may be required to change its laws or behave differently, but not to pay monetary damages except by an explicit statutory waiver.
In some cases, the law permits plaintiffs who successfully sue government agencies to collect money damages or other monetary relief. In particular, the Civil Rights Attorney's Fees Award Act of 1976 leaves the government liable in some civil rights cases. Fee awards under this civil rights statute are considered "equitable relief" rather than damages, and government entities are not immune from equitable relief.
Under laws such as this, the ACLU and its state affiliates sometimes share in monetary judgments against government agencies. In 2006, the Public Expressions of Religion Protection Act sought to prevent monetary judgments in the particular case of violations of church-state separation.
The ACLU has received court-awarded fees from opponents. For example, the Georgia affiliate was awarded $150,000 in fees after suing a county to demand the removal of a Ten Commandments display from its courthouse; a second Ten Commandments case in the state, in a different county, led to a $74,462 judgment. The State of Tennessee was required to pay $50,000, the State of Alabama $175,000, and the State of Kentucky $121,500, in similar Ten Commandments cases.
State Affiliates:
Most of the organization's workload is performed by its local affiliates. There is at least one affiliate organization in each state, as well as one in Washington, DC, and in Puerto Rico. California has three affiliates.
The affiliates operate autonomously from the national organization; each affiliate has its own staff, executive director, board of directors, and budget. Each affiliate consists of two non-profit corporations: a 501(c)(3) corporation that does not perform lobbying, and a 501(c)(4) corporation which is entitled to lobby.
ACLU affiliates are the basic unit of the ACLU's organization and engage in litigation, lobbying, and public education. For example, in a twenty-month period beginning January 2004, the ACLU's New Jersey chapter was involved in fifty-one cases according to their annual report—thirty-five cases in state courts, and sixteen in federal court.
They provided legal representation in thirty-three of those cases, and served as amicus in the remaining eighteen. They listed forty-four volunteer attorneys who assisted them in those cases.
Positions:
The ACLU's official position statements, as of January 2012, included the following policies:
- Affirmative action – The ACLU supports affirmative action.
- Birth control and abortion – The ACLU supports the right to abortion, as established in the Roe v. Wade decision. The ACLU believes that everyone should have affordable access to the full range of contraceptive options. The ACLU's Reproductive Freedom Project manages efforts related to reproductive rights.
- Campaign funding – The ACLU believes that the current system is badly flawed, and supports a system based on public funding. The ACLU supports full transparency to identify donors. However, the ACLU opposes attempts to control political spending. The ACLU supported the Supreme Court's decision in Citizens United v. FEC, which allowed corporations and unions more political speech rights.
- Child pornography – The Arizona chapter of the ACLU believes that production of child pornography should be illegal, but that possessing it is protected by the right to privacy. "Our policy is that possessing even pornographic material about children should not itself be a crime. The way to deal with this issue is to prosecute the makers of child pornography for exploiting minors."
- Criminal law reform – The ACLU seeks an end to what it feels are excessively harsh sentences that "stand in the way of a just and equal society". The ACLU's Criminal Law Reform Project focuses on this issue.
- Death penalty – The ACLU is opposed to the death penalty in all circumstances. The ACLU's Capital Punishment Project focuses on this issue.
- Free speech – The ACLU supports free speech, including the right to express unpopular or controversial ideas, such as flag desecration, racist or sexist views, etc. However, a leaked memo from June 2018 said that speech that can "inflict serious harms" and "impede progress toward equality" may be a lower priority for the organization.
- Gun rights – The national ACLU's position is that the Second Amendment protects a collective right to own guns rather than an individual right, despite the 2008 Supreme Court decision in District of Columbia v. Heller that the Second Amendment protects an individual right. The national organization's position is based on the phrases "a well regulated Militia" and "the security of a free State". However, the ACLU opposes any effort to create a registry of gun owners and has worked with the National Rifle Association to prevent a registry from being created, and it has favored protecting the right to carry guns under the Fourth Amendment.
- HIV/AIDS – The policy of the ACLU is to "create a world in which discrimination based on HIV status has ended, people with HIV have control over their medical information and care, and where the government's HIV policy promotes public health and respect and compassion for people living with HIV and AIDS." This effort is managed by the ACLU's AIDS Project.
- Human rights – The ACLU's Human Rights Project advocates (primarily in an international context) for children's rights, immigrants' rights, gay rights, and other international human rights obligations.
- Immigrants' rights – The ACLU supports civil liberties for immigrants to the United States.
- Lesbian, gay, bisexual and transgender rights – The ACLU's LGBT Rights Project supports equal rights for all gays and lesbians, and works to eliminate discrimination. The ACLU supports equal employment, housing, civil marriage and adoption rights for LGBT couples.
- National security – The ACLU is opposed to compromising civil liberties in the name of national security. In this context, the ACLU has condemned government use of spying, indefinite detention without charge or trial, and government-sponsored torture. This effort is led by the ACLU's National Security Project.
- Prisoners' rights – The ACLU's National Prison Project believes that incarceration should only be used as a last resort, and that prisons should focus on rehabilitation. The ACLU works to ensure that prisons treat prisoners in accordance with the Constitution and domestic law.
- Privacy and technology – The ACLU's Project on Speech, Privacy, and Technology promotes "responsible uses of technology that enhance privacy protection", and opposes uses "that undermine our freedoms and move us closer to a surveillance society".
- Racial issues – The ACLU's Racial Justice Program combats racial discrimination in all aspects of society, including the educational system, justice system, and the application of the death penalty. However, the ACLU opposes state censorship of the Confederate flag.
- Religion – The ACLU supports the right of religious persons to practice their faiths without government interference. The ACLU believes the government should neither prefer religion over non-religion, nor favor particular faiths over others. The ACLU is opposed to school-led prayer, but protects students' right to pray in school. It opposes the use of religious beliefs to discriminate, such as refusing to provide abortion coverage or providing services to LGBT people.
- Single sex public education – The ACLU opposes single sex public education options. It believes that single-sex education contributes to gender stereotyping and compares single-sex education to racial segregation.
- Voting rights – The ACLU believes that impediments to voting should be eliminated, particularly if they disproportionately impact minority or poor citizens. The ACLU believes that misdemeanor convictions should not lead to a loss of voting rights. The ACLU's Voting Rights Project leads this effort.
- Women's rights – The ACLU works to eliminate discrimination against women in all realms. The ACLU encourages government to be proactive in stopping violence against women. These efforts are led by the ACLU's Women's Rights project.
Support and opposition:
The ACLU is supported by a variety of persons and organizations. There were over 1,000,000 members in 2017, and the ACLU annually receives thousands of grants from hundreds of charitable foundations.
Allies of the ACLU in legal actions have included:
- the National Association for the Advancement of Colored People,
- the American Jewish Congress,
- People For the American Way,
- the National Rifle Association,
- the Electronic Frontier Foundation,
- Americans United for Separation of Church and State,
- and the National Organization for Women.
The ACLU has been criticized by liberals, such as when it excluded communists from its leadership ranks, when it defended Neo-Nazis, when it declined to defend Paul Robeson, or when it opposed the passage of the National Labor Relations Act.
Conversely, it has been criticized by conservatives, such as when it argued against official prayer in public schools, or when it opposed the Patriot Act. The ACLU has supported conservative figures such as Rush Limbaugh, George Wallace, Henry Ford, and Oliver North; and it has supported liberal figures such as Dick Gregory, Rockwell Kent, and Benjamin Spock.
A major source of criticism is the set of legal cases in which the ACLU represents an individual or organization that promotes offensive or unpopular viewpoints, such as:
- the Ku Klux Klan,
- Neo-Nazis,
- Nation of Islam,
- North American Man/Boy Love Association,
- the Westboro Baptist Church,
- or the Unite the Right rally.
The ACLU responded to these criticisms by stating "It is easy to defend freedom of speech when the message is something many people find at least reasonable. But the defense of freedom of speech is most critical when the message is one most people find repulsive."
Click on any of the following blue hyperlinks for more about the ACLU:
- Early years
- 1930s
- Mid-century
- Post-Cold War era
- See also:
- American Civil Rights Union
- Institute for Justice
- List of court cases involving the American Civil Liberties Union
- National Emergency Civil Liberties Committee
- New York Civil Liberties Union
- Political freedom
- Official website
- American Civil Liberties Union Records, Princeton University. Document archive 1917–1950, including the history of the ACLU.
- Debs Pamphlet Collection, Indiana State University Library. An array of annual ACLU reports in PDF.
- List of 100 most important ACLU victories, New Hampshire Civil Liberties Union.
- De-classified FBI records on the ACLU
Tax Havens and Offshore Banks
- YouTube Video: The Bizarre Economics of Tax Havens and Pirate Banking: James S. Henry at TEDxRadboudU 2013
- YouTube Video: Where in the world is it easiest to get rich? | Harald Eia | TEDxOslo
- YouTube Video: Tax avoidance: a necessary evil? | Alexandre Stylianoudis | TEDxUniversityofKent
A tax haven is defined as a country or place with very low "effective" rates of taxation for foreign investors ("headline" rates may be higher).
In some traditional definitions, a tax haven also offers financial secrecy. However, while countries with high levels of secrecy but also high rates of taxation (e.g. the United States and Germany in the Financial Secrecy Index ("FSI") rankings), can feature in some tax haven lists, they are not universally considered as tax havens.
In contrast, countries with lower levels of secrecy but also low "effective" rates of taxation (e.g. Ireland in the FSI rankings), appear in most § Tax haven lists. The consensus around effective tax rates has led academics to note that the term "tax haven" and "offshore financial centre" are almost synonymous.
Traditional tax havens, like Jersey, are open about zero rates of taxation, but as a consequence have limited bilateral tax treaties. Modern corporate tax havens have non-zero "headline" rates of taxation and high levels of OECD compliance, and thus have large networks of bilateral tax treaties.
However, their base erosion and profit shifting ("BEPS") tools enable corporations to achieve "effective" tax rates closer to zero, not just in the haven but in all countries with which the haven has tax treaties, which puts them on tax haven lists.
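As a rough illustration (with purely hypothetical figures, not data from any actual haven), the gap between a haven's "headline" rate and a corporation's "effective" rate can be sketched as:

```python
# Illustrative sketch (hypothetical figures): how profit shifting can push a
# corporation's "effective" tax rate far below a haven's "headline" rate.

def effective_tax_rate(profit_booked, profit_shifted, headline_rate):
    """Tax actually paid as a share of total economic profit, when a
    portion of profit is shifted onward untaxed via BEPS tools."""
    total_profit = profit_booked + profit_shifted
    tax_paid = profit_booked * headline_rate  # shifted profit is untaxed
    return tax_paid / total_profit

# A haven may advertise a 12.5% headline rate...
headline = 0.125
# ...but if 90 of every 100 units of profit are shifted onward untaxed:
rate = effective_tax_rate(profit_booked=10, profit_shifted=90, headline_rate=headline)
print(f"headline: {headline:.1%}, effective: {rate:.2%}")  # effective: 1.25%
```

The function and figures are invented for illustration only; real BEPS structures involve many jurisdictions and transfer-pricing arrangements.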
According to modern studies, the § Top 10 tax havens include corporate-focused havens like the Netherlands, Singapore, Ireland and the U.K., while Luxembourg, Hong Kong, the Caribbean (the Caymans, Bermuda, and the British Virgin Islands) and Switzerland feature as both major traditional tax havens and major corporate tax havens. Corporate tax havens often serve as "conduits" to traditional tax havens.
Use of tax havens results in a loss of tax revenues to countries which are not tax havens. Estimates of the § Financial scale of taxes avoided vary, but the most credible have a range of US$100–250 billion per annum. In addition, capital held in tax havens can permanently leave the tax base (base erosion).
Estimates of capital held in tax havens also vary: the most credible estimates are between US$7–10 trillion (up to 10% of global assets). The harm of traditional and corporate tax havens has been particularly noted in developing nations, where the tax revenues are needed to build infrastructure.
Over 15% of countries are sometimes labelled tax havens. Tax havens are mostly successful and well-governed economies, and being a haven has brought prosperity. The top 10–15 GDP-per-capita countries, excluding oil and gas exporters, are tax havens.
Because of § Inflated GDP-per-capita (due to accounting BEPS flows), havens are prone to over-leverage, since international capital misprices the artificially low debt-to-GDP ratio. This can lead to severe credit cycles and/or property and banking crises when international capital flows are repriced.
Ireland's Celtic Tiger, and the subsequent financial crisis in 2009–13, is an example; Jersey is another. Research shows the § U.S. as the largest beneficiary, and that use of tax havens by U.S. corporations maximized long-term U.S. exchequer receipts.
The focus on combating tax havens (e.g. OECD–IMF projects) has been on common standards, transparency and data sharing. The rise of OECD-compliant corporate tax havens, whose BEPS tools are responsible for most of the lost taxes, has led to criticism of this approach, versus actual taxes paid.
Higher-tax jurisdictions, such as the United States and many member states of the European Union, departed from the OECD BEPS Project in 2017–18 to introduce anti-BEPS tax regimes targeted at raising the net taxes paid by corporations in corporate tax havens (e.g. the U.S. Tax Cuts and Jobs Act of 2017 ("TCJA"), with its GILTI–BEAT–FDII tax regimes and move to a hybrid "territorial" tax system, and the proposed EU Digital Services Tax and EU Common Consolidated Corporate Tax Base).
Click on any of the following blue hyperlinks for more about Tax Havens:
- Definitions
- Groupings
- Lists
- Scale
- Incentives
- Benefits
- Concepts
- Data leaks
- Countermeasures
- History
- See also:
- Asset protection
- Association for the Taxation of Financial Transactions and for Citizens' Action
- Bank secrecy
- Conduit and Sink OFCs
- Corporate haven
- Corporate inversion
- Flag of convenience
- Foreign Account Tax Compliance Act
- Free port
- Free economic zone
- Global Forum on Transparency and Exchange of Information for Tax Purposes
- International business company
- International taxation
- Ireland as a tax haven
- List of foundations established in Vaduz
- List of countries by tax rates
- Luxembourg leaks
- Offshore company
- Offshore financial centre
- Offshore trust
- Panama as a tax haven
- Panama Papers
- Paradise Papers
- Pirate haven
- Tax noncompliance
- Tax exile
- Tax exporting
- Tax hell
- Tax Justice Network
- Tax shelter
- United States as a tax haven
- Vulture fund
- International Financial Centres Forum (IFC Forum)
- IMF – Offshore Banking and Financial Centers
- Offshore Financial Centers – IMF Background Paper
- Global Forum on Transparency and Exchange of Information for Tax Purposes, OECD
- Global Financial Integrity
- Task Force on Financial Integrity & Economic Development
- An OECD Proposal To Eliminate Tax Competition Would Mean Higher Taxes and Less Privacy – Heritage Foundation: Washington D.C.
- The Economic Case for Tax Havens
- "Why tax havens are a blessing" – the Cato Institute
- "Profiting from corruption: The role and responsibility of financial institutions" – U4 Anti-Corruption Resource Centre
- Tax Havens • Explained With Maps (Documentary)
An offshore bank is a bank regulated under an international banking license (often called an offshore license), which usually prohibits the bank from establishing any business activities in the jurisdiction of establishment.
Due to reduced regulation and transparency, accounts with offshore banks were often used to hide undeclared income. Since the 1980s, jurisdictions that provide financial services to nonresidents on a large scale have been referred to as offshore financial centres (OFCs). Since OFCs often also levy little or no tax on corporate and/or personal income, they are often referred to as tax havens.
With increasing worldwide measures on CFT (combating the financing of terrorism) and AML (anti-money laundering) compliance, the offshore banking sector in most jurisdictions has been subject to changing regulations. Since 2000, the Financial Action Task Force has issued the so-called FATF blacklist of "Non-Cooperative Countries or Territories" (NCCTs), jurisdictions it perceives to be non-cooperative in the global fight against money laundering and terrorist financing.
An account held in a foreign offshore bank is often described as an offshore account. Typically, an individual or company will maintain an offshore account for the financial and legal advantages it provides, including:
- Greater privacy (see also bank secrecy, a principle born with the 1934 Swiss Banking Act)
- Little or no taxation (i.e., tax havens per above)
- Easy access to deposits (at least in terms of regulation)
- Protection against local, political, or financial instability.
While the term originates from the Channel Islands being "offshore" from the United Kingdom, and while most offshore banks are located in island nations to this day, the term is used figuratively to refer to any bank used for these advantages, regardless of location. Thus, some banks in landlocked Andorra, Luxembourg, and Switzerland may be described as "offshore banks".
Offshore banking has often been associated with the underground economy and organized crime, tax evasion and money laundering; however, legally, offshore banking does not prevent assets from being subject to personal income tax on interest. Except for certain people who meet fairly complex requirements (such as perpetual travelers), the personal income tax laws of many countries (e.g., France, Malaysia, and the United States) make no distinction between interest earned in local banks and that earned abroad.
Persons subject to US income tax, for example, are required to declare, on penalty of perjury, any foreign bank accounts—which may or may not be numbered bank accounts—they may have. Although offshore banks may decide not to report income to other tax authorities and have no legal obligation to do so, as they are protected by bank secrecy, this does not make the non-declaration of the income by the taxpayer or the evasion of the tax on that income legal.
Following the 9/11 attacks, there have been many calls to increase regulation on international finance, in particular concerning offshore banks, tax havens, and clearing houses such as Clearstream, based in Luxembourg, which are possible crossroads for major illegal money flows.
Click on any of the following blue hyperlinks for more about Offshore Banks:
- Offshore banking comparison by jurisdictions
- Scope of offshore banking
- Banking advantages
- Banking disadvantages
- European crackdown
- Banking services
- Money laundering
- Regulation of international banks
- See also:
Discrimination in America Today
- YouTube Video: Eight black women discuss the politics of skin tone
- YouTube Video: What is sexual orientation discrimination? | Equality law: discrimination explained
- YouTube Video: The Cost of Gender Inequality
Discrimination is prejudicial treatment of a person based on their membership in a group. The term is used to highlight differences in treatment between members of different groups, when one group is intentionally singled out and treated worse or denied the same opportunities.
Historically, attitudes toward minorities in the United States have been marked by discrimination. Many forms of discrimination have come to be recognized in U.S. society, on the basis of national origin, race, gender, and sexuality in particular.
History:
Racism:
Main articles: Racism in the United States and Racial inequality in the United States
Discrimination based on skin color, also known as colorism, is a form of racial discrimination in which people are treated unequally according to skin tone. It first took root in America during slavery, when lighter-skinned slaves tended to work indoors while darker-skinned slaves worked outdoors.
In 1865, during the Reconstruction period after the Civil War, the Thirteenth Amendment to the United States Constitution was passed, abolishing slavery. This was soon followed by the Fourteenth Amendment, which granted citizenship to all persons "born or naturalized in the United States", and the Fifteenth Amendment, which protected citizens' right to vote regardless of race, color, or previous condition of servitude.
These Amendments, passed during the Reconstruction period, extended protection to the newly emancipated slaves. However, in the 1870s Jim Crow laws were introduced in the Southeastern United States. These laws promoted the doctrine of "separate but equal", later upheld in Plessy v. Ferguson (1896), under which all races were nominally equal but had to use separate public facilities.
The mixing of races was illegal in most public places, such as public schools, public transportation, and restaurants. These laws entrenched discrimination and segregation in the United States.
Oftentimes, the facilities and sections designated "Colored" were inferior to those designated "White Only". Water fountains, bathrooms, and park benches were just a few of the areas segregated under Jim Crow laws. Furthermore, the Jim Crow laws systematically made life harder for African-Americans and people of color: they made voting harder to accomplish, since African-Americans had to pass literacy tests and overcome other obstacles before getting the chance to vote.
In the modern United States, gay black men are extremely likely to experience intersectional discrimination. In the United States, the children of gay African-American men have a poverty rate of 52 percent, the highest in the country. Gay African-American men in partnerships are also six times more likely to live in poverty than gay white male couples.
Fighting back:
Major figures such as Martin Luther King Jr., Malcolm X, and Rosa Parks were involved in the fight against the race-based discrimination of the Civil Rights Movement. Rosa Parks's refusal to give up her bus seat in 1955 sparked the Montgomery bus boycott—a large movement in Montgomery, Alabama that was an integral period at the beginning of the Civil Rights Movement.
The Bus Boycott lasted a total of 381 days before the Supreme Court ruled that segregated seating was unconstitutional. Dr. Martin Luther King Jr., a peaceful activist and pastor, led many such protests, advocating for the advancement of African-Americans in American society. His leadership of the Montgomery Bus Boycott helped to launch his role in the Civil Rights Movement. King organized many protests attended not only by African-Americans but also by Caucasians.
While King organized peaceful protests, Malcolm X went a different route. He and his main supporters, the Nation of Islam, stressed the ideas of black power and black pride.
Although Malcolm X's actions were radical, especially where they contradicted those of Dr. King, he is still considered one of the pioneers in fighting back against racial discrimination in daily life, not just from a political standpoint. His ideas of black nationalism and the use of violence to fight back helped to spark the formation of the Black Panther Party for Self-Defense, which later became known as the Black Panther Party.
Formed by Bobby Seale and Huey P. Newton in October 1966 in Oakland, California, the Black Panthers, usually seen dressed in all black and armed, started off patrolling police activity in Oakland but soon grew to widespread support in cities like Los Angeles and Chicago. Although they were seen by some as a violent gang and a danger to society, the Black Panthers brought numerous social programs, such as free breakfast for schoolchildren and free clinics, to cities across the country.
What the Black Panthers were fighting for was outlined in their Ten-Point Program. They were ultimately taken down by the FBI, led by J. Edgar Hoover, in the early 1970s. Other factors, such as internal tensions and financial struggles, also played into the demise of the Black Panther Party, and by 1982 it had disbanded completely.
In the education system, the Civil Rights Movement gained further momentum after the ruling in Brown v. Board of Education in 1954. Oliver Brown challenged the Board of Education of Topeka, Kansas, when his daughter was not allowed to enroll in any of the all-white schools, claiming that "separate but equal" violated the Equal Protection Clause of the Fourteenth Amendment.
Ruby Bridges, along with the Little Rock Nine, dealt with discrimination from Caucasian peers, their parents, and the community in general during the desegregation of schools. The Little Rock Nine were a group of nine African-American students who volunteered to attend school at the Central High School in Little Rock, Arkansas. They continuously had problems with the public and faced harsh treatment from other students, parents, and even the Little Rock National Guard.
However, a change occurred when President Dwight D. Eisenhower intervened by sending federal troops to escort the students. Ruby Bridges joined the Civil Rights Movement on November 14, 1960, when she enrolled in William Frantz Elementary School.
Because many parents would not allow their children in her class, Bridges was taught alone by Barbara Henry and often ate lunch and spent recess by herself. She and her family faced considerable backlash throughout Louisiana over the desegregation, but they received support from many people in the North, and Bridges was able to finish the year.
Contemporary society:
Gender discrimination:
Gender discrimination is another form of discrimination. Women are often seen as an expense to their employers because they take days off for children, need time off for maternity leave, and are stereotyped as "more emotional". Two related concepts are the glass ceiling, which holds that invisible barriers keep women from rising into positions of authority, and the glass escalator, which holds that men in female-dominated professions are pushed forward into management, even surpassing women who have been at the job longer and have more experience in the field.
Men's rights deals with discrimination against men in the areas of family law, such as divorce and child custody, labor such as paternity leave, paternity fraud, health, education, conscription, and other areas of the law such as domestic violence, genital integrity, and allegations of rape.
Discrimination against immigrants:
Immigrants to the United States are affected by a totally separate type of discrimination. Some people feel as though the large numbers of people being allowed into the country are cause for alarm, therefore discriminate against them.
Arizona recently passed a law that forces people to carry documents with them at all times to prove their citizenship. This is only one controversy over immigrants in the United States, another is the claim that immigrants are stealing "true Americans'" jobs.
Immigration restrictions are among the biggest government interventions in the economy. They prevent millions of people from taking jobs, renting homes, and pursuing a wide range of opportunities that they could otherwise have.
Violent hate crimes have increased drastically. Recent social psychological research suggests that this form of prejudice against migrants may be partly explained by some fairly basic cognitive processes.
According to Soylu, some argue that immigrants constantly face being discriminated against because of the color of their skin, the sound of their voice, the way they look and their beliefs. Many immigrants are well educated, some argue that they are often blamed and persecuted for the ills in society such as overcrowding of schools, disease and unwanted changes in the host country's culture due to the beliefs of this "unwelcomed" group of people.
According to Soylu, there was an open immigration policy up until 1924 in America until the National Origins Act came into effect. According to the Immigration Act of 1924 which is a United States federal law, it limited the annual number of immigrants who could be admitted from any country to 2% of the number of people from that country who were already living in the United States in 1890, down from the 3% cap set by the Immigration Restriction Act of 1921, according to the Census of 1890.
It superseded the 1921 Emergency Quota Act. The law was primarily aimed at further restricting immigration of Southern Europeans and Eastern Europeans. According to Buchanan, later in the 1930s with the advent of opinion polling, immigration policy analysis was carried out by collecting public thoughts and opinions on the issue. These factors encouraged a heated debate on immigration policy.
These debates continued even into the 2000s, and were intensified by George W. Bush's immigration proposal. Some argue that the 9/11 terrorist attacks left the country in a state of paranoia and fear that strengthened the views in favor of having closed borders.
Discrimination in the workplace:
Immigration to the United States can be difficult due to immigrants' lack of access to legal documents and the expensive nature of immigration. The United States has historically been a major target destination for people seeking work and continues to be so today.
As Graciela, a 47-year-old married woman who had lived in the US for 4 years, stated, “My husband,…he lost his job. Things were beginning to get tough…We came with the need to find work and better life possibilities.” Worldwide, the workforce has become increasingly pluralistic and ethnically diverse as more and more people migrate across nations.
Although race- or ethnicity-based discriminatory employment practices are prohibited in most developed countries, according to feminist scholar Mary Harcourt, actual discrimination is still widespread. Sahagian Jacqueline, an author, argues that one example of this act of discrimination occurred at Macy's a department store.
According to the U.S. Justice Department, Macy's used unfair documentation practices against legal immigrant employees who had proper work authorizations. During an eligibility re-verification process, Macy's broke immigration law that prohibits employers from discriminating against immigrant employees during re-verification by asking for more or different documents than other employees are required to submit based on a worker's immigration status or national origin.
Some of the affected employees lost seniority, were suspended, or even let go due to the illegal re-verification. While their opinions are controversial, researchers Moran, Tyler and Daranee argue that with immigrants' growing numbers and their expanding economic role in U.S. society, addressing challenges and creating opportunities for immigrants to succeed in the labor force are critical prerequisites to improve the economic security for all low-wage working families and ensure the future vitality of our economy.
Discrimination based on sexual orientation:
Another type of discrimination is that against lesbian, gay, bisexual, and transgender individuals. For personal reasons such as religious beliefs, employers sometimes choose to not hire these people. LGBT rights have been protested against for various reasons; for example, one topic of controversy related to LGBT people is marriage, which was legalized in all states in June 2015 following the Supreme Court case Obergefell v. Hodges.
See also:
Historically, attitudes toward minorities in the United States have been marked by discrimination. Many forms of discrimination have come to be recognized in U.S. society, particularly on the basis of national origin, race, gender, and sexuality.
History:
Racism:
Main articles: Racism in the United States and Racial inequality in the United States
Discrimination based on skin color is a form of racial discrimination in which people are treated unequally because of their skin color. In America it initially came about during slavery: lighter-skinned slaves tended to work indoors, while darker-skinned slaves worked outdoors.
In 1865, during the Reconstruction period after the Civil War, the Thirteenth Amendment to the United States Constitution was passed, abolishing slavery. This was soon followed by the Fourteenth Amendment to the United States Constitution, which granted citizenship to all persons "born or naturalized in the United States", and the Fifteenth Amendment to the United States Constitution, which prohibited denying the right to vote on the basis of race.
These Amendments passed during the Reconstruction period extended protection to the newly emancipated slaves. However, in the 1870s Jim Crow laws were introduced in the Southern United States. These laws promoted the doctrine of "separate but equal", upheld in Plessy v. Ferguson in 1896, under which the races were nominally equal but had to use separate public facilities.
The mixing of races was illegal in most places such as public schools, public transportation and restaurants. These laws increased discrimination and segregation in the United States.
Oftentimes, the products and facilities designated for the "Colored" were inferior to those designated "White Only". Water fountains, bathrooms, and park benches were just a few of the areas segregated under Jim Crow laws. Furthermore, the Jim Crow laws systematically made life harder for African-Americans and people of color: voting became harder to accomplish because African-Americans had to pass literacy tests and clear other obstacles before getting the chance to vote.
In the modern United States, gay black men are extremely likely to experience intersectional discrimination. In the United States, the children of gay African-American men have a poverty rate of 52 percent, the highest in the country. Gay African-American men in partnerships are also six times more likely to live in poverty than gay white male couples.
Fighting back:
Major figures such as Martin Luther King Jr., Malcolm X, and Rosa Parks were involved in the fight against the race-based discrimination of the Civil Rights Movement. Rosa Parks's refusal to give up her bus seat in 1955 sparked the Montgomery bus boycott—a large movement in Montgomery, Alabama that was an integral period at the beginning of the Civil Rights Movement.
The Bus Boycott lasted a total of 381 days before the Supreme Court ruled that segregated seating was unconstitutional. Dr. Martin Luther King Jr., a peaceful activist and pastor, led many such protests, advocating for the advancement of African-Americans in American society. His role in the Montgomery Bus Boycott helped launch his leadership of the Civil Rights Movement. King organized many protests attended not only by African-Americans but also by Caucasians.
While King organized peaceful protests, Malcolm X went a different route. He and his main supporters, the Nation of Islam, stressed the ideas of black power and black pride.
Although Malcolm X's actions were radical, especially where they contradicted those of Dr. King, he is still considered one of the pioneers in fighting back against racial discrimination in daily life and not just from a political standpoint. His ideas of black nationalism and the use of violence to fight back helped inspire the founding of the Black Panther Party for Self-Defense, which later became known as the Black Panther Party.
Formed by Bobby Seale and Huey P. Newton, the organization was created in October 1966 in Oakland, California. Usually seen dressed in black and armed, the Black Panthers started off patrolling police activity in Oakland but soon gained widespread support in cities such as Los Angeles and Chicago. Although they were seen by many as a violent gang and a danger to society, the Black Panthers brought numerous social programs, such as free breakfast for schoolchildren and free clinics, to numerous cities.
What the Black Panthers were fighting for was outlined in their Ten-Point Program. They were ultimately taken down by the FBI, led by J. Edgar Hoover, in the early 1970s. Other factors, such as internal tensions and financial struggles, also played into the demise of the Black Panther Party, and by 1982 it had dissolved completely.
In the education system, the Civil Rights Movement gained further momentum after the 1954 ruling in Brown v. Board of Education. Oliver Brown challenged the Board of Education of Topeka, Kansas when his daughter was not allowed to enroll in any of the all-white schools, claiming that "separate but equal" violated the Equal Protection Clause of the Fourteenth Amendment.
Ruby Bridges, along with the Little Rock Nine, dealt with discrimination from Caucasian peers, their parents, and the community in general during the desegregation of schools. The Little Rock Nine were a group of nine African-American students who volunteered to attend Central High School in Little Rock, Arkansas. They continuously had problems with the public and faced harsh treatment from other students, parents, and even the Arkansas National Guard.
However, a change occurred when President Dwight D. Eisenhower intervened by sending federal troops to escort the students. Ruby Bridges joined the Civil Rights Movement on November 14, 1960, when she enrolled in William Frantz Elementary School.
Because many parents would not allow their children in her class, Bridges was taught by herself by Barbara Henry, and she often ate and had recess alone. Bridges and her family faced considerable backlash throughout Louisiana over desegregation; however, they received support from numerous people in the North, and Bridges was able to finish the year.
Contemporary society:
Gender discrimination:
Further information:
- Gender inequality in the United States,
- Pregnancy discrimination in the United States,
- and Gender pay gap in the United States
Gender discrimination is another form of discrimination. Women are often seen as an expense to their employers because they take days off for children, need time off for maternity leave, and are stereotyped as "more emotional". Two related concepts are the glass ceiling, which holds that women are held back from positions of authority in male-dominated professions, and the glass escalator, which holds that men in female-dominated professions often rise quickly to positions of authority.
Men are pushed forward into management, even surpassing women who have been at the job longer and have more experience in the field.
Men's rights advocacy deals with discrimination against men in areas of family law such as divorce and child custody; labor issues such as paternity leave and paternity fraud; health; education; conscription; and other areas of the law such as domestic violence, genital integrity, and allegations of rape.
Discrimination against immigrants:
Immigrants to the United States are affected by a distinct type of discrimination. Some people see the large numbers of people being allowed into the country as cause for alarm and therefore discriminate against immigrants.
In 2010, Arizona passed a law requiring people to carry documents with them at all times to prove their legal status. This is only one controversy over immigrants in the United States; another is the claim that immigrants are stealing "true Americans'" jobs.
Immigration restrictions are among the biggest government interventions in the economy. They prevent millions of people from taking jobs, renting homes, and pursuing a wide range of opportunities that they could otherwise have.
Violent hate crimes have increased drastically. Recent social psychological research suggests that this form of prejudice against migrants may be partly explained by some fairly basic cognitive processes.
According to Soylu, some argue that immigrants constantly face discrimination because of the color of their skin, the sound of their voice, the way they look, and their beliefs. Although many immigrants are well educated, some argue that they are often blamed and persecuted for ills in society, such as overcrowding of schools, disease, and unwanted changes in the host country's culture attributed to the beliefs of this "unwelcomed" group of people.
According to Soylu, America had an open immigration policy until the National Origins Act came into effect in 1924. The Immigration Act of 1924, a United States federal law, limited the annual number of immigrants who could be admitted from any country to 2% of the number of people from that country already living in the United States as recorded in the 1890 census, down from the 3% cap set by the Emergency Quota Act of 1921, which it superseded.
The law was primarily aimed at further restricting immigration of Southern and Eastern Europeans. According to Buchanan, in the 1930s, with the advent of opinion polling, immigration policy analysis began to draw on public thoughts and opinions on the issue. These factors encouraged a heated debate on immigration policy.
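The quota formulas themselves are simple percentages. The sketch below illustrates them with an invented country-of-origin count (the article does not give per-country figures, and note that the 1921 act actually used the 1910 census as its base while the 1924 act used the 1890 census):

```python
# Hypothetical illustration of the 1921 vs. 1924 national-origin quotas.
# The resident count is invented for the example; actual quotas used
# real per-country census figures (1910 census base for the 1921 act,
# 1890 census base for the 1924 act).

residents_counted = 100_000  # hypothetical country-of-origin population

quota_1921 = int(residents_counted * 0.03)  # 3% cap, Emergency Quota Act of 1921
quota_1924 = int(residents_counted * 0.02)  # 2% cap, Immigration Act of 1924

print(quota_1921)  # 3000
print(quota_1924)  # 2000
```

For the same base population, the 1924 act cut the annual quota by a third, and switching to the older 1890 census shrank the base itself for the Southern and Eastern European nationalities the law targeted.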
These debates continued even into the 2000s, and were intensified by George W. Bush's immigration proposal. Some argue that the 9/11 terrorist attacks left the country in a state of paranoia and fear that strengthened the views in favor of having closed borders.
Discrimination in the workplace:
Immigration to the United States can be difficult due to immigrants' lack of access to legal documents and the expensive nature of immigration. The United States has historically been a major target destination for people seeking work and continues to be so today.
As Graciela, a 47-year-old married woman who had lived in the US for 4 years, stated, “My husband,…he lost his job. Things were beginning to get tough…We came with the need to find work and better life possibilities.” Worldwide, the workforce has become increasingly pluralistic and ethnically diverse as more and more people migrate across nations.
Although race- or ethnicity-based discriminatory employment practices are prohibited in most developed countries, according to feminist scholar Mary Harcourt, actual discrimination is still widespread. The author Jacqueline Sahagian argues that one example of this discrimination occurred at Macy's, a department store.
According to the U.S. Justice Department, Macy's used unfair documentation practices against legal immigrant employees who had proper work authorizations. During an eligibility re-verification process, Macy's broke immigration law that prohibits employers from discriminating against immigrant employees during re-verification by asking for more or different documents than other employees are required to submit based on a worker's immigration status or national origin.
Some of the affected employees lost seniority, were suspended, or were even let go due to the illegal re-verification. While their opinions are controversial, researchers Moran, Tyler and Daranee argue that, with immigrants' growing numbers and expanding economic role in U.S. society, addressing challenges and creating opportunities for immigrants to succeed in the labor force are critical prerequisites for improving the economic security of all low-wage working families and ensuring the future vitality of the economy.
Discrimination based on sexual orientation:
Another type of discrimination is that against lesbian, gay, bisexual, and transgender individuals. For personal reasons such as religious beliefs, employers sometimes choose not to hire these people. LGBT rights have been opposed for various reasons; for example, one topic of controversy related to LGBT people is same-sex marriage, which was legalized in all states in June 2015 following the Supreme Court case Obergefell v. Hodges.
See also:
- Age discrimination in the United States
- Black flight
- Civil Rights
- Civil Union
- Employment discrimination in the United States
- Housing discrimination in the United States
- Housing segregation
- Mortgage discrimination
- Racial inequality in the United States
- Racial steering
- Religious discrimination in the United States
- White flight
Social Status
- YouTube Video: How to motivate yourself to change your behavior
- YouTube Video: Why being respectful to your coworkers is good for business
- YouTube Video: How to stay calm when you know you'll be stressed
Social status is the relative level of respect, honor, assumed competence, and deference accorded to people, groups, and organizations in a society. Some writers have also referred to a socially valued role or category a person occupies as a "status" (e.g., gender, race, having a criminal conviction, etc.).
Status is based on beliefs about which members of a society hold comparatively more or less social value. By definition, these beliefs are broadly shared among members of a society. As such, people use status hierarchies to allocate resources, leadership positions, and other forms of power. In doing so, these shared cultural beliefs make unequal distributions of resources and power appear natural and fair, supporting systems of social stratification.
Status hierarchies appear to be universal across human societies, affording valued benefits to those who occupy the higher rungs, such as better health, social approval, resources, influence, and freedom.
Status hierarchies depend primarily on the possession and use of status symbols. These are cues people use to determine how much status a person holds and how they should be treated.
Such symbols can include the possession of socially valuable attributes, like being conventionally beautiful or having a prestigious degree. Other status symbols include wealth and its display through conspicuous consumption. Status in face-to-face interaction can also be conveyed through certain controllable behaviors, such as assertive speech, posture, and emotional displays.
Click on any of the following blue hyperlinks for more about Social Status:
- Determination
- In different societies
- In nonhuman animals
- Status inconsistency
- Inborn and acquired
- Social mobility
- Social stratification
- Max Weber's three dimensions of stratification:
- See also:
Ideology, including a List of Political Ideologies
- YouTube Video: Political Ideology: Crash Course Government and Politics
- YouTube Video: Political Culture and Ideology
- YouTube Video: Ideologies of political parties in the United States | US government and civics | Khan Academy
An ideology is a collection of normative beliefs and values that an individual or group holds for other than purely epistemic reasons. In other words, these rely on basic assumptions about reality that may or may not have any factual basis.
The term is especially used to describe systems of ideas and ideals which form the basis of economic or political theories and resultant policies. In these there are tenuous causal links between policies and outcomes owing to the large numbers of variables available, so that many key assumptions have to be made. In political science the term is used in a descriptive sense to refer to political belief systems.
The term was coined by Antoine Destutt de Tracy, a French Enlightenment aristocrat and philosopher, who conceived it in 1796 as the "science of ideas" during the French Reign of Terror by trying to develop a rational system of ideas to oppose the irrational impulses of the mob.
However, in contemporary philosophy the term is narrower in scope than that original concept, or than the ideas expressed in broad concepts such as worldview, The Imaginary, and ontology.
In the sense defined by French Marxist philosopher Louis Althusser, ideology is "the imagined existence (or idea) of things as it relates to the real conditions of existence".
Click on any of the following blue hyperlinks for more about Political Ideology:
- Etymology and history
- Analysis
- Political ideologies
- Epistemological ideologies
- Psychological research
- Ideology and semiotic theory
- Sociological uses
- Quotations
- See also:
- The Anatomy of Revolution
- List of communist ideologies
- Capitalism
- Feminism
- Hegemony
- -ism
- List of ideologies named after people
- Ideocracy
- Noble lie
- Social criticism
- Socially constructed reality
- State collapse
- State ideology of the Soviet Union
- The True Believer
- World Values Survey
- World view
- The Pervert's Guide to Ideology: How Ideology Seduces Us—and How We Can (Try to) Escape It
- Ideology Study Guide
- Toll, Mathew (2009), Ideology and Symbolic Power: Between Althusser and Bourdieu
Anti-discrimination Laws, including a List of Anti-Discrimination Laws in the United States
- YouTube Video: How Movies Help Us Understand Discrimination
- YouTube Video: Color blind or color brave? | Mellody Hobson
- YouTube Video: Why Gender Equality Is Good for Everyone — Men Included | Michael Kimmel | TED Talks
Anti-discrimination law or non-discrimination law refers to legislation designed to prevent discrimination against particular groups of people; these groups are often referred to as protected groups or protected classes.
Anti-discrimination laws vary by jurisdiction with regard to the types of discrimination that are prohibited, and also the groups that are protected by that legislation. Commonly, these types of legislation are designed to prevent discrimination in employment, housing, education, and other areas of social life, such as public accommodations.
Anti-discrimination law may include protections for groups based on any of the following:
- sex,
- age,
- race,
- ethnicity,
- nationality,
- disability,
- mental illness or ability,
- sexual orientation,
- gender,
- gender identity/expression,
- sex characteristics,
- religion,
- creed,
- or individual political opinions.
Anti-discrimination laws are rooted in principles of equality, specifically, that individuals should not be treated differently due to the characteristics outlined above.
Anti-discrimination laws are designed to protect against both individual discrimination (committed by individuals) and structural discrimination (arising from policies or procedures that disadvantage certain groups).
Courts may take into account both discriminatory intent and disparate impact in determining whether a particular action or policy constitutes discrimination.
Click on any of the following blue hyperlinks for more about Anti-discrimination Laws:
- International
- History of anti-discrimination legislation
- Effects
- Exceptions
- See also:
- Labor law
- Discrimination (Employment and Occupation) Convention (ILO Convention No. 111)
- Anti-discrimination laws in Brazil
- Employment equity (Canada)
- Employment discrimination law in the United States
- Employment discrimination law in the United Kingdom
- History of women in the military
- LGBT rights by country or territory
- Public accommodations
- Reasonable accommodation
The following is a List of Anti-discrimination Laws in the United States:
- Age Discrimination Act of 1975
- Age Discrimination in Employment Act of 1967
- Alaska's Anti-Discrimination Act of 1945
- Americans with Disabilities Act of 1990
- California Fair Employment and Housing Act
- Civil Rights Act of 1866
- Civil Rights Act of 1871
- Civil Rights Act of 1957
- Civil Rights Act of 1964
- Civil Rights Act of 1968
- Civil Rights Act of 1991
- Employment Non-Discrimination Act
- Equal Pay Act of 1963
- Executive Order 11478
- Executive Order 13166 – “Improving Access to Services for Persons with Limited English Proficiency”
- Fair Employment Act of 1941
- Family & Medical Leave Act of 1993 - enables qualified employees to take prolonged unpaid leave for family and health-related reasons without fear of losing their jobs. Applies to private employers with 50 or more employees
- Fourteenth Amendment to the United States Constitution
- Genetic Information Nondiscrimination Act
- Homeless Bill of Rights
- Lloyd–La Follette Act (1912)
- Lilly Ledbetter Fair Pay Act of 2009
- Massachusetts Gender Identity Anti-Discrimination Initiative
- New Jersey Anti-Bullying Bill of Rights Act
- No-FEAR Act
- Voting Rights Act of 1965
- Pregnancy Discrimination Act of 1978
- Rehabilitation Act of 1973
Immigration Policies of Donald Trump, including Illegal Immigration to the United States, along with the NY Times (12/11/2018) article "8 Million People Are Working Illegally in the U.S. Here’s Why That’s Unlikely to Change."
- YouTube Video 'People Are Kept In Cages': Inside Border Patrol Center | Morning Joe | MSNBC
- YouTube Video Six Children Have Died While in ICE Custody - Why?
- YouTube Video: Trump calls some illegal immigrants "animals" in meeting with sheriffs
[Your WebHost: 3/10/2021 Update: While this topic may no longer seem relevant since Trump was voted out in the 2020 election, we leave it in to ensure that America never chooses another racist President like Trump.]
Donald Trump wants to deport every single illegal immigrant - could he?
(BBC 11/11/2015)
US Republican presidential candidate Donald Trump wants to deport every illegal immigrant from the United States. The other Republican candidates say it can't be done - one called it a "silly argument".
And the majority of US Republican voters disagree with Mr Trump: according to a 2015 survey by the Pew Research Center, 56% believe undocumented immigrants should be allowed to stay if they meet certain criteria.
So who's right? And what would happen if US authorities attempted to carry out Mr Trump's audacious plan?
A huge task:
There are approximately 11.3 million undocumented immigrants in the US. Rounding them up and deporting them would present a huge logistical and financial challenge to America's military, law enforcement, and border control agencies.
Mr Trump hasn't set out a timeframe for his mass deportation strategy, but a 2015 study by the American Action Forum (AAF), a conservative think tank, estimates it would take about 20 years to find and deport that many people.
Using good old-fashioned American school buses, 650 would have to run every month, without a seat to spare, for two decades. It would also require continuous operations from a variety of law enforcement and other government bodies - with all the cost that entails.
So how much is that? Based on an analysis for 5 million people, the Center for American Progress estimates that a mass deportation from the US would cost an average of $10,070 (£6,624) per person. For 11.3 million people, that's $114bn (£75bn).
And that would cover only the basic operational costs - apprehension, detention, legal processing, and deportation. According to the AAF, the total cost of a 20-year mass deportation programme would be somewhere between $420 and $620 billion.
But we're not finished yet: there's still the impact on the economy. The AAF report, published earlier this year, estimates that undocumented immigrants made up 6.4% of the country's labour force - about 11 million workers - in 2014.
It predicts that deporting all of those workers would shrink the US economy by nearly 6%, or $1.6 trillion, by 2035.
That's not to mention the enormous potential for lawsuits and reparations claims filed against the government.
And what about... society?
This massive deportation programme would have to be done with the support - or at least tacit consent - of the American people, many of whom will have lived and worked with, befriended and loved undocumented immigrants.
According to a 2013 study by Pew, illegal immigrant adults had been in the country for a median of 13 years at the time the study was carried out.
Would ordinary Americans turn a blind eye while neighbors, colleagues and friends were rounded up and taken away? Or would it precipitate mass civil unrest? In 2010, Arizona introduced a law that allowed police to check the legal status of anyone they suspected of being an illegal immigrant, and 100,000 people hit the streets to protest.
And then there is the thorny issue of how this would all look. In an age when nearly everyone has a video camera in their pocket, could soldiers really round people up - young and old, entire families - and force them on to buses and trains? Would the soldiers have machine guns and dogs? Could the average American stomach those images, with all their attendant historical echoes?
Are there any other options?
The majority of US citizens - especially Hispanics, younger Americans and Democrats - support a path to either citizenship or permanent residency for undocumented immigrants.
Under plans first put forward by President Obama in 2014, about five million undocumented immigrants would have been allowed to apply for work permits and eventually permanent residency.
The program would have shielded immigrants who have been in the US since 2010, have not been convicted of a serious crime and have ties to US citizens.
Hillary Clinton, the Democratic candidate to succeed Mr Obama, supported the plan and pledged to expand it. Mr Trump made it clear he was firmly against the idea.
But Mr Obama's plan was rejected by Congress and then by the Supreme Court.
[End of BBC Article]
___________________________________________________________________________
New York Times (12/11/2018) Article "8 Million People Are Working Illegally in the U.S. Here’s Why That’s Unlikely to Change."
They make beds in inns across the country. They pick oranges in Florida, strawberries in California and vegetables in Ohio. And they have built new subdivisions in Phoenix, Atlanta and Charlotte.
For years, policymakers have talked about shutting off the influx of undocumented workers.
But the economy has grown to rely on them.
Ending illegal immigration, say many of those who have studied the issue, could mean that American workers would lose their jobs, companies would close and the economy would contract.
In recent years, though, border security has tightened considerably, a strong economy has driven down unemployment, and many employers, particularly those offering low-paid jobs, say there are few alternatives to hiring workers without legal documents.
President Trump, it turns out, is caught on both sides of the balance between border security and economic prosperity.
The president has vowed to erect a wall to keep out undocumented immigrants and has ramped up the deportation of those already in the United States. His administration has conducted payroll audits and workplace raids, which have resulted in the arrest of thousands of workers.
But four undocumented workers have recently come forward at the Trump National Golf Club in Bedminster, N.J., and the federal E-Verify database suggests that the Trump Organization does not use heightened employment document verification procedures at several other of its properties across the country, meaning that the chances of employing undocumented workers are high.
Like undocumented workers across the country, the former Bedminster employees interviewed by The New York Times said they used counterfeit Social Security and green cards to get hired.
The Trump Organization has vowed to terminate any undocumented workers it finds on its payroll, and the fate of any of its workers who do not have legal working papers remains unclear.
What is clear, however, is that at a time of extremely low unemployment, 3.7 percent nationally, Mr. Trump’s golf club might struggle to recruit legal workers to replace any undocumented workers who are terminated.
Most undocumented immigrants are in the labor force:
About eight million of the nearly 11 million immigrants unlawfully in the United States — down from a high of 12.2 million in 2007 — participate in the labor force. They account for about 5 percent of all workers, according to the Pew Research Center.
“Our economy has absorbed these workers and employers would like more of them, given the low unemployment rate,” said Madeline Zavodny, an economist at the University of North Florida who is an expert on the economics of immigration.
Undocumented immigrants are over-represented in low-skilled jobs such as farming, construction and child care. Unauthorized workers, for example, represent about 15 percent of those employed in construction.
Often, these are jobs their employers have trouble filling with American workers.
Anabele Garcia, an undocumented immigrant from Mexico, toils in the vineyards of Sonoma County in California, earning about $15 an hour. When the season ends each year, she finds work cleaning houses and wine estates, earning about $20 an hour. Her husband, Jorge Romero, works in the cow pastures nearby.
“We are here to do any work,” said Ms. Garcia, 39. “There are no Americans in the fields.”
Raising wages is not a catchall solution:
What would happen if all the undocumented immigrants went away?
Steve Camarota, research director at the Center for Immigration Studies, which supports curbs on immigration, believes that wages would rise and motivate many chronically unemployed Americans to get back to work.
But wage rates are not the main issue, some economists say, because there still would not be enough Americans willing to do blue-collar jobs.
Expectations and status play a role, said Chris Tilly, a labor economist at the Luskin School of Public Affairs at the University of California, Los Angeles. “Not everybody will do dirty work,” he said.
They might prefer to make a low wage working inside an Amazon distribution center to putting shingles on a roof.
A survey conducted in late 2017 by the Associated General Contractors of America found that 70 percent of construction companies were having difficulty hiring roofers, bricklayers and electricians, among others. The accommodation and food services sector reported a record number of vacancies this October.
A closed border could mean a shrinking economy:
Historically, the regulation of the border with Mexico, the main source of migration, “has always been driven by the needs of the economy,” Mr. Tilly said.
That’s less true now, under the Trump administration, which has sought to check illegal border crossings by all means possible.
Giovanni Peri, an economist who studies immigration labor at the University of California, Davis, said that with a true cutoff in illegal immigration, the economy would contract. The impact, he said, would fall not just on immigrants — because their work sustains sectors that employ many Americans.
“Some sectors, like construction, agriculture, housing and personal services would be drastically reduced,” Mr. Peri said. “There would be companies closing and relocating. There would be jobs lost. There will be towns and cities that would see half their population disappear.”
“It definitely would trigger a recession,” he said. “We are talking about a lot of job loss.”
It is very unlikely that weak, vulnerable American workers would benefit from jobs previously held by immigrants because some of these jobs themselves would disappear, he said.
Some Americans are unemployed for a reason:
“Very few of the jobs these immigrants have would be taken by these Americans,” Mr. Peri said. “The ones who are not employed have complicated circumstances like drug addiction, alcohol addiction or criminal records.”
Especially at a time of low unemployment, he added, “This would be the worst time to lose them. There are no unemployed Americans ready to do their jobs.”
In some sectors, “there might be people who would do these jobs at much higher wages,” said Ms. Zavodny, the economist in Florida. “But it is not clear those jobs would exist at much higher wages.”
Agriculture, construction and service would be hard-hit:
The agriculture industry has begun to invest in automation and robotics to compensate for a worsening labor shortage.
In the lettuce fields of California’s Salinas Valley, a new machine plies row after row of romaine lettuce, doing the backbreaking work, long performed by people, of lopping heads of romaine from the field. It saves time and human labor.
Still, more than half of all field workers are undocumented, according to the Farm Bureau, which has said that their sudden disappearance would deal a catastrophic blow to American agriculture.
Since the 1990s, undocumented immigrants from Mexico and Central America have flocked to towns like Dalton, Ga., to work in the carpet mills. Across the South and in fast-growing cities like Denver, hundreds of thousands have been absorbed by the construction industry as roofers, painters and bricklayers.
Unauthorized immigrants in 2016 represented 10.6 percent of the labor force in Nevada, 8.6 percent in California and 8.2 percent in Texas, according to a study released last month by the Pew Research Center.
In states like Georgia and North Carolina, their presence has grown rapidly to represent 5.4 percent and 4.5 percent, respectively, of the labor force.
In all but four states, service occupations, such as being a waiter, dishwasher or maid, together draw the largest number of undocumented immigrants, the Pew report found.
About 31 percent of all undocumented immigrant workers were in service occupations in 2016, according to the estimates, which were based on data gathered by the Census Bureau.
Unauthorized immigrants represent about 24 percent of all workers in farming, fishing and forestry and 15 percent of those employed in construction, which is the industry that uses the most undocumented immigrant workers overall, at 1.35 million.
Nearly one quarter of restaurant workers in 2016 were foreign-born compared with 18.5 percent for all sectors, according to data from the Bureau of Labor Statistics, compiled by the National Restaurant Association. A large share are likely undocumented, economists say.
“These workers are often long-tenured and skilled,” said Craig Regelbrugge, senior vice president of industry advocacy and research at AmericanHort, which represents the nursery industry. “They are nothing short of vital to farms, businesses, and rural economies.”
“Each job they perform sustains two to three jobs in the surrounding economy, so even though few Americans seek this field and farm work, the jobs of many Americans and many communities are sustained by their contributions.”
[End of NY Times Article]
___________________________________________________________________________
Immigration policy of Donald Trump
Immigration policy and, specifically, illegal immigration to the United States, was a signature issue of U.S. President Donald Trump's presidential campaign, and his proposed reforms and remarks about this issue generated much publicity.
Trump has repeatedly said that some illegal immigrants are criminals. A 2015 study by the non-partisan Migration Policy Institute had concluded that around 820,000 unauthorized immigrants had criminal records; however, increasing evidence indicates that immigration does not correlate with higher crime rates.
A hallmark promise of his campaign was to build a substantial wall on the United States–Mexico border and to force Mexico to pay for the wall. Trump has also expressed support for a variety of "limits on legal immigration and guest-worker visas", including a "pause" on granting green cards, which Trump says will "allow record immigration levels to subside to more moderate historical averages".
Trump's proposals regarding H-1B visas frequently changed throughout his presidential campaign, but as of late July 2016, he appeared to oppose the H-1B visa program.
As president, Trump imposed a travel ban that prohibited issuing visas to citizens of seven largely-Muslim countries. In response to legal challenges he revised the ban twice, with his third version being upheld by the Supreme Court in June 2018.
He attempted to end the Deferred Action for Childhood Arrivals program, but a legal injunction has allowed the policy to continue while the matter is the subject of legal challenge. He imposed a "zero tolerance" policy to require the arrest of anyone caught illegally crossing the border, which resulted in separating children from their families.
Tim Cook and 58 other CEOs of major American companies warned of harm from Trump's immigration policy. The "zero tolerance" policy was reversed in June 2018, but multiple media reports of continued family separations were published in the first half of 2019.
In his first State of the Union address on January 30, 2018, Trump outlined his administration's four pillars for immigration reform:
(1) a path to citizenship for DREAMers;
(2) increased border security funding;
(3) ending the diversity visa lottery;
(4) restrictions on family-based immigration.
The four pillars reinforce Trump's campaign slogan "Buy American, Hire American" and his 2017 executive order of the same name, and track with previously outlined immigration policy priorities.
Myths and facts about immigration to the United States
Excerpt from The New Colossus, as engraved at the Statue of Liberty, greeting millions of immigrants at Ellis Island:
Anti-immigration proponents in the United States make many claims about illegal immigration from Mexico that are based on exaggerations, misconceptions, myths and outright lies.
Some of their complaints may be true on a very small scale, but by and large, most of their assertions do not reflect the actual trends in illegal immigration into the U.S.
Further, those who oppose illegal immigration tend to focus their rhetoric on appeals to emotion with scant evidence, such as claiming illegal immigrants are "taking our jobs" and "threatening our security".
Their only "solution" seems to be, "kick out the wetbacks and build a wall" (which would actually cause major damage to the U.S. economy).
U.S. Immigration and Customs Enforcement (ICE)
U.S. Immigration and Customs Enforcement, commonly known by its initials ICE, is an agency of the Department of Homeland Security tasked with enforcing the United States' immigration laws.
It should be noted that ICE does not patrol the US borders; that task is handled by its sister agency, the United States Customs and Border Protection. Instead, ICE is responsible for arresting, detaining, and deporting undocumented immigrants already within the United States.
Under the presidency of Donald Trump, the agency has become unreasonably aggressive in its task of removing unwanted individuals to the point that it has become responsible for an ongoing human rights fiasco.
It wasn't all Trump's fault, however; ICE's abuses against undocumented immigrants happened under President Obama's watch as well.
ICE was established in 2003 by the Homeland Security Act, which created the Department of Homeland Security and integrated the former Immigration and Naturalization Service into what is now ICE.
Abuses:
Family separation policy and child detention centers:
As part of the Trump Administration's zero-tolerance immigration policy, the family separation policy of migrant children from their parents was officially adopted from April 2018 to June 2018, although it was found that the practice had begun a year prior to the policy's official announcement.
As of December 2018, roughly 15,000 children were detained in U.S. custody.
Rape and sexual assault:
ICE has a long history of rape and sexual assault within the agency.
In April 2018, The Intercept reported they had obtained 1,224 complaints of sexual assault filed between 2010 and September 2017, but DHS officials indicated they had received roughly 33,000 complaints between 2010 and 2016 that had not been shared with The Intercept. In February 2019, government documents revealed that over 5,800 migrant children faced alleged sexual abuse while in ICE custody.
Inhumane conditions:
Another report from The Intercept found that at the ICE processing center in Adelanto, California, the staff had been ignoring medical emergencies, the food was infested with maggots, and detainees working in the kitchens were being paid only $1 a day.
A report from the DHS Office of Inspector General reported guards mocking survivors of attempted suicide as "suicide failures", inadequate sanitation and food services within the facility, medical staff neglecting detainees, and staff members refusing to allow religious practices, telephone access and visitation.
Another DHS report found that the facility in Newark, New Jersey had kitchens serving raw, spoiled and expired meat and storing moldy bread, leaking roofs in every housing unit, flat and dilapidated bed mattresses held together with tied sheets, and showers with mold, mildew and peeling paint.
Additionally, in response to hunger strikes by eleven detainees at the processing center in El Paso, Texas, ICE officers began force-feeding six of them with nasal tubes, resulting in nosebleeds and vomiting. Immigrant children at Shenandoah Valley Juvenile Center in Virginia have said that they were beaten, left in their cells naked, and strapped to chairs by the guards.
ICE regularly inflicts solitary confinement on detainees at the slightest provocation, confining people to tiny cells for 23 hours a day and letting them pace around inside what amounts to a cage for one hour.
A policy adviser at the DHS revealed that ICE had been inflicting solitary confinement as punishment on gay detainees, transgender detainees, disabled detainees, detainees with mental illnesses, and detainees who reported abuses committed by ICE staff, with only half of the cases being for rule violations.
The UN has declared that solitary confinement in excess of 15 days should be banned, as it is a form of torture. Despite this, the International Consortium of Investigative Journalists analyzed more than 8,400 records of ICE using solitary confinement on detainees and found that more than half of the cases exceeded 15 days, 573 exceeded 90 days, 32 stints lasted more than a year, and fully one-third of detainees placed in solitary were mentally ill (373 of whom had to be placed on suicide watch).
Deaths in detention:
Under the Freedom of Information Act, The New York Times and the American Civil Liberties Union obtained documents revealing 107 deaths in ICE custody from October 2003 to 2007.
Since 2007, that number has risen to 188, with 22 deaths occurring in the last two years.
Of the 24 individuals to die during Trump's term in office, it has been concluded that at least six of those deaths were caused by the agency's failure to provide adequate medical care.
Trump has exacerbated the problem of deaths occurring in ICE custody. 2017, his first year in office, saw 12 detainees die in ICE custody, the greatest number since 2009, Obama's first year in office.
Clara Long, a senior researcher for the Human Rights Watch has said that, "ICE has proven unable or unwilling to provide adequately for the health and safety of the people it detains. The Trump administration’s efforts to drastically expand the already-bloated immigration detention system will only put more people at risk."
Religious freedom violations:
See the main article on this topic: Freedom of religion
While the Trump administration touts its commitment to "religious freedom," both ICE and the Border Patrol frequently and intentionally violate the religious rights of their detainees.
Egregious examples of these abuses include:
This conduct is not limited to Muslims and Sikhs; Christians as well are consistently denied access to religious materials, services, and clergy. There are also reports of guards mocking Christian detainees for praying.
These violations of fundamental rights are unnecessary for any security purpose and appear to be intended as a means of harassment and a means of pressuring detainees to give up on their case and accept immediate deportation rather than to wait out the results of their case.
The US government has previously acknowledged how important religion can be for prisoner morale, mental health, and rehabilitation; what ICE is doing is illegal under the Religious Land Use and Institutionalized Persons Act of 2000, which guarantees religious rights for all prisoners in the United States, whether documented or otherwise.
Incompetence and corruption:
In February 2019, the DHS Office of Inspector General reported that ICE had not been holding private contractors accountable for issues at their detention centers, issuing only two fines in 14,003 cases between October 2015 and June 2018 that included sexual assault, use of tear gas and so on.
ICE has also proven to be dangerously vindictive. When citizens in several North Carolina counties elected sheriffs who agreed to terminate 287(g) cooperation agreements with ICE (agreements that notify federal agents of the immigration status of detained individuals), ICE retaliated with a five-day crackdown in those counties, arresting 200 people, 60 of whom ICE had previously held no interest in.
This was done to cause maximum chaos in order to punish the citizens and sheriffs of those counties for refusing to comply with ICE's draconian anti-immigrant measures. ICE also punishes detainees who have the temerity to cooperate in legal cases against them.
Such was the case of several Iraqi detainees, who were ACLU clients, in Arizona who were told they were about to be released, only for their guards to reveal mid-charade that they were actually only being sent to a different ICE facility. This was an intentional and cruel form of psychological punishment.
ICE's American police state:
Perhaps the most frightening aspect of ICE's enforcement procedures for everyday US citizens is their policy of launching mass raids against anyone who looks too brown to be a "real American". This has been standard ICE practice since the middle of the Bush administration.
In 2006 ICE started using the excuse of investigating "document fraud" to raid workplaces across the country with ridiculously heavily armed agents. President George W. Bush publicly announced his desire to purge America of all illegal immigrants, directly causing 2008 to be a record year for workplace raids, with 6,287 arrests.
Obama issued a series of executive orders to focus ICE on dangerous criminals, but Trump's "zero tolerance" policy reversed that to target potentially any illegal immigrant.
ICE raids are traumatic events for the communities they target, as ICE rushes in with military equipment and helicopters to whisk away hundreds of people, leaving jobs empty, families torn apart, and local organizations straining to return things to normality.
Although it seems logical that ICE should first go after dangerous criminals, the agency instead targets so-called "low hanging fruit": people without criminal records who visit ICE for routine check-ins in the hope of earning citizenship or who testify against criminals in court. The effect of ICE raids is widespread fear among immigrant communities and a complete withdrawal from civil and social life.
ICE also runs roughshod over the law. In the wake of Trump's "zero tolerance" policy, ICE and CBP agents have taken the law into their own hands by ignoring and violating court orders, denying access to legal counsel, and forcing people to sign documents they have not read.
ICE also arrests DACA beneficiaries who are here legally and conducts unwarranted raids in defiance of the Fourth Amendment. Increasing aggressiveness in raiding has resulted in ICE casting a pall over immigrant and minority communities with the threat of "collateral arrests", or arrests of individuals not actually targeted by the raid but who happen to have the wrong skin color.
Even more absurdly, ICE has taken to arresting people while they are in the process of applying for citizenship, a counterproductive policy that discourages people from complying with the law.
Even legal immigrants are now afraid to visit government buildings, as they are afraid that the ICE agents loitering outside are there for them, too.
ICE uses intimidation tactics against US citizens associated with their targets in the hopes of circumventing constitutional rights and getting away with misrepresenting the law.
Widespread racial profiling and aggressive raiding have led to ICE mistakenly arresting almost 1,500 United States citizens as of 2018, with ICE agents ignoring and mocking detainees' rightful claims of citizenship.
See also:
Donald Trump wants to deport every single illegal immigrant - could he?
(BBC 11/11/2015)
US Republican presidential candidate Donald Trump wants to deport every illegal immigrant from the United States. The other Republican candidates say it can't be done - one called it a "silly argument".
And the majority of US Republican voters disagree with Mr Trump: according to a 2015 survey by the Pew Research Center, 56% believe undocumented immigrants should be allowed to stay if they meet certain criteria.
So who's right? And what would happen if US authorities attempted to carry out Mr Trump's audacious plan?
A huge task:
There are approximately 11.3 million undocumented immigrants in the US. Rounding them up and deporting them would present a huge logistical and financial challenge to America's military, law enforcement, and border control agencies.
Mr Trump hasn't set out a timeframe for his mass deportation strategy, but a 2015 study by the American Action Forum (AAF), a conservative think tank, estimates it would take about 20 years to find and deport that many people.
Using good old-fashioned American school buses, 650 would have to run every month, without a seat to spare, for two decades. Plus continuous operations from a variety of law enforcement and other government bodies - with all the cost that entails.
So how much is that? Based on an analysis for 5 million people, the Centre for American Progress estimates that a mass deportation from the US would cost an average of $10,070 (£6,624) per person. For 11.3 million people, that's $114bn (£75bn).
And that would cover only the basic operational costs - apprehension, detention, legal processing, and deportation. According to the AAF, the total cost of a 20-year mass deportation programme would be somewhere between $420 and $620 billion.
But we're not finished yet, there's still the impact on the economy. The AAF report, published earlier this year, estimates that undocumented immigrants made up 6.4% of the country's labour force - about 11 million workers - in 2014.
It predicts that deporting all of those workers would shrink the US economy by nearly 6%, or $1.6 trillion, by 2035.
That's not to mention the enormous potential for lawsuits and reparations claims filed against the government.
And what about... society?This massive deportation programme would have to be done with the support - or at least tacit consent - of the American people, many of whom will have lived and worked with, befriended and loved undocumented immigrants.
According to a 2013 study by Pew, illegal immigrant adults had been in the country for a median of 13 years at the time the study was carried out.
Would ordinary Americans turn a blind eye while neighbors, colleagues and friends were rounded up and taken away? Or would it precipitate mass civil unrest? In 2010, Arizona introduced a law that allowed police to check the legal status of anyone they suspected of being an illegal immigrant, and 100,000 people hit the streets to protest.
And then there is the thorny issue of how this would all look. In an age when nearly everyone has a video camera in their pocket, could soldiers really round people up - young and old, entire families - and force them on to buses and trains? Would the soldiers have machine guns and dogs? Could the average American stomach those images, with all their attendant historical echoes?
Are there any other options?The majority of US citizens - especially Hispanics, younger Americans and Democrats - support a path to either citizenship or permanent residency for undocumented immigrants.
Under plans first put forward by President Obama in 2014, about five million undocumented immigrants would have been allowed to apply for work permits and eventually permanent residency.
The program would have shielded immigrants who have been in the US since 2010, have not been convicted of a serious crime and have ties to US citizens.
Hillary Clinton, the Democratic candidate to succeed Mr Obama, supported the plan and pledged to expand it. Mr Trump made it clear he was firmly against the idea.
But Mr Obama's plan was rejected by Congress and then by the Supreme Court.
[End of BBC Article]
___________________________________________________________________________
New York Times (12/11/2018) Article "8 Million People Are Working Illegally in the U.S. Here’s Why That’s Unlikely to Change."
They make beds in inns across the country. They pick oranges in Florida, strawberries in California and vegetables in Ohio. And they have built new subdivisions in Phoenix, Atlanta and Charlotte.
For years, policymakers have talked about shutting off the influx of undocumented workers.
But the economy has grown to rely on them.
Ending illegal immigration, say many of those who have studied the issue, could mean that American workers would lose their jobs, companies would close and the economy would contract.
In recent years, though, border security has tightened considerably, a strong economy has driven down unemployment, and many employers, particularly those offering low-paid jobs, say there are few alternatives to hiring workers without legal documents.
President Trump, it turns out, is caught on both sides of the balance between border security and economic prosperity.
The president has vowed to erect a wall to keep out undocumented immigrants and has ramped up the deportation of those already in the United States. His administration has conducted payroll audits and workplace raids, which have resulted in the arrest of thousands of workers.
But four undocumented workers have recently come forward at the Trump National Golf Club in Bedminster, N.J., and the federal E-Verify database suggests that the Trump Organization does not use heightened employment document verification procedures at several other of its properties across the country, meaning that the chances of employing undocumented workers are high.
Like undocumented workers across the country, the former Bedminster employees interviewed by The New York Times said they used counterfeit Social Security and green cards to get hired.
The Trump Organization has vowed to terminate any undocumented workers it finds on its payroll, and the fate of any of its workers who do not have legal working papers remains unclear.
What is clear, however, is that at a time of extremely low unemployment, 3.7 percent nationally, Mr. Trump’s golf club might struggle to recruit legal workers to replace any undocumented workers who are terminated.
Most undocumented immigrants are in the labor force:
About eight million of the nearly 11 million immigrants unlawfully in the United States — down from a high of 12.2 million in 2007 — participate in the labor force. They account for about 5 percent of all workers, according to the Pew Research Center.
“Our economy has absorbed these workers and employers would like more of them, given the low unemployment rate,” said Madeline Zavodny, an economist at the University of North Florida who is an expert on the economics of immigration.
Undocumented immigrants are over-represented in low-skilled jobs such as farming, construction and child care; for example, unauthorized workers represent about 15 percent of those employed in construction.
Often, these are jobs their employers have trouble filling with American workers.
Anabele Garcia, an undocumented immigrant from Mexico, toils in the vineyards of Sonoma County in California, earning about $15 an hour. When the season ends each year, she finds work cleaning houses and wine estates, earning about $20 an hour. Her husband, Jorge Romero, works in the cow pastures nearby.
“We are here to do any work,” said Ms. Garcia, 39. “There are no Americans in the fields.”
Raising wages is not a catchall solution:
What would happen if all the undocumented immigrants went away?
Steve Camarota, research director at the Center for Immigration Studies, which supports curbs on immigration, believes that wages would rise and motivate many chronically unemployed Americans to get back to work.
But wage rates are not the main issue, some economists say, because there still would not be enough Americans willing to do blue-collar jobs.
Expectations and status play a role, said Chris Tilly, a labor economist at the Luskin School of Public Affairs at the University of California, Los Angeles. “Not everybody will do dirty work,” he said.
American workers might prefer making a low wage inside an Amazon distribution center to putting shingles on a roof.
A survey conducted in late 2017 by the Associated General Contractors of America found that 70 percent of construction companies were having difficulty hiring roofers, bricklayers and electricians, among others. The accommodation and food services sector reported a record number of vacancies in October 2018.
A closed border could mean a shrinking economy:
Historically, the regulation of the border with Mexico, the main source of migration, “has always been driven by the needs of the economy,” Mr. Tilly said.
That’s less true now, under the Trump administration, which has sought to check illegal border crossings by all means possible.
Giovanni Peri, an economist who studies immigration and labor at the University of California, Davis, said that with a true cutoff in illegal immigration, the economy would contract. The impact, he said, would fall not just on immigrants, because their work sustains sectors that employ many Americans.
“Some sectors, like construction, agriculture, housing and personal services would be drastically reduced,” Mr. Peri said. “There would be companies closing and relocating. There would be jobs lost. There would be towns and cities that would see half their population disappear.”
“It definitely would trigger a recession,” he said. “We are talking about a lot of job loss.”
It is very unlikely that vulnerable American workers would benefit from jobs previously held by immigrants, because some of those jobs themselves would disappear, he said.
Some Americans are unemployed for a reason:
“Very few of the jobs these immigrants have would be taken by these Americans,” Mr. Peri said. “The ones who are not employed have complicated circumstances like drug addiction, alcohol addiction or criminal records.”
Especially at a time of low unemployment, he added, “This would be the worst time to lose them. There are no unemployed Americans ready to do their jobs.”
In some sectors, “there might be people who would do these jobs at much higher wages,” said Ms. Zavodny, the economist in Florida. “But it is not clear those jobs would exist at much higher wages.”
Agriculture, construction and service would be hard-hit:
The agriculture industry has begun to invest in automation and robotics to compensate for a worsening labor shortage.
In the lettuce fields of California’s Salinas Valley, a new machine plies row after row of romaine lettuce, doing the backbreaking work, long performed by people, of lopping heads of romaine from the field. It saves time and human labor.
Still, more than half of all field workers are undocumented, according to the Farm Bureau, which has said that their sudden disappearance would deal a catastrophic blow to American agriculture.
Since the 1990s, undocumented immigrants from Mexico and Central America have flocked to towns like Dalton, Ga., to work in the carpet mills. Across the South and in fast-growing cities like Denver, hundreds of thousands have been absorbed by the construction industry as roofers, painters and bricklayers.
Unauthorized immigrants in 2016 represented 10.6 percent of the labor force in Nevada, 8.6 percent in California and 8.2 percent in Texas, according to a study released last month by the Pew Research Center.
In states like Georgia and North Carolina, their presence has grown rapidly to represent 5.4 percent and 4.5 percent, respectively, of the labor force.
In all but four states, service occupations, such as being a waiter, dishwasher or maid, together draw the largest number of undocumented immigrants, the Pew report found.
About 31 percent of all undocumented immigrant workers were in service occupations in 2016, according to the estimates, which were based on data gathered by the Census Bureau.
Unauthorized immigrants represent about 24 percent of all workers in farming, fishing and forestry and 15 percent of those employed in construction, which is the industry that uses the most undocumented immigrant workers overall, at 1.35 million.
Nearly one quarter of restaurant workers in 2016 were foreign-born compared with 18.5 percent for all sectors, according to data from the Bureau of Labor Statistics, compiled by the National Restaurant Association. A large share are likely undocumented, economists say.
“These workers are often long-tenured and skilled,” said Craig Regelbrugge, senior vice president of industry advocacy and research at AmericanHort, which represents the nursery industry. “They are nothing short of vital to farms, businesses, and rural economies.”
“Each job they perform sustains two to three jobs in the surrounding economy, so even though few Americans seek this field and farm work, the jobs of many Americans and many communities are sustained by their contributions.”
[End of NY Times Article]
___________________________________________________________________________
Immigration policy of Donald Trump
Immigration policy and, specifically, illegal immigration to the United States, was a signature issue of U.S. President Donald Trump's presidential campaign, and his proposed reforms and remarks about this issue generated much publicity.
Trump has repeatedly said that some illegal immigrants are criminals. A 2015 study by the non-partisan Migration Policy Institute had concluded that around 820,000 unauthorized immigrants had criminal records; however, increasing evidence indicates that immigration does not correlate with higher crime rates.
A hallmark promise of his campaign was to build a substantial wall on the United States–Mexico border and to force Mexico to pay for the wall. Trump has also expressed support for a variety of "limits on legal immigration and guest-worker visas", including a "pause" on granting green cards, which Trump says will "allow record immigration levels to subside to more moderate historical averages".
Trump's proposals regarding H-1B visas frequently changed throughout his presidential campaign, but as of late July 2016, he appeared to oppose the H-1B visa program.
As president, Trump imposed a travel ban that prohibited issuing visas to citizens of seven largely Muslim countries. In response to legal challenges, he revised the ban twice; the third version was upheld by the Supreme Court in June 2018.
He attempted to end the Deferred Action for Childhood Arrivals program, but a legal injunction has allowed the policy to continue while the matter is the subject of legal challenge. He imposed a "zero tolerance" policy to require the arrest of anyone caught illegally crossing the border, which resulted in separating children from their families.
Tim Cook and 58 other CEOs of major American companies warned of harm from Trump's immigration policy. The "zero tolerance" policy was reversed in June 2018, but multiple media reports of continued family separations were published in the first half of 2019.
In his first State of the Union address on January 30, 2018, Trump outlined his administration's four pillars for immigration reform:
(1) a path to citizenship for DREAMers;
(2) increased border security funding;
(3) ending the diversity visa lottery;
(4) restrictions on family-based immigration.
The four pillars reinforce Trump's campaign slogan "Buy American, Hire American" and his 2017 executive order of the same name, and track with previously outlined immigration policy priorities.
Click on any of the following blue hyperlinks for more about the Immigration Policies of Donald Trump:
- Background in business practices
- Positions on immigration
- Executive actions
- Travel ban and refugee suspension
- Increased immigration enforcement
- Phase out of DACA
- Cancellation of Temporary Protective Status
- Zero-tolerance policy and family separation on the Mexico border
- Changes to asylum policy
- "Public charge" restrictions on awarding Green cards
- Elimination of "Medical deferred action"
- Reorganization of Department of Homeland Security
- Legal and reports
- For-profit detention centers
- See also:
- Immigration to the United States
- Immigration reduction in the United States
- Central American Minors Program
- Deferred Action for Childhood Arrivals
- Immigration reform in the United States
- Mexico–United States barrier
- Trump wall
- United States Refugee Admissions Program (USRAP)
- Timeline of federal policy on immigration, 2017-2020 on Ballotpedia
Myths and facts about immigration to the United States
Excerpt from The New Colossus, as engraved at the Statue of Liberty, greeting millions of immigrants at Ellis Island:
- "Give me your tired, your poor,
- Your huddled masses yearning to breathe free,
- The wretched refuse of your teeming shore.
- Send these, the homeless, tempest-tost to me,
- I lift my lamp beside the golden door!"
Anti-immigration proponents in the United States make many claims about illegal immigration from Mexico that are based on exaggerations, misconceptions, myths and outright lies.
Some of their complaints may be true on a very small scale, but by and large, most of their assertions do not reflect the actual trends in illegal immigration into the U.S.
Further, those who oppose illegal immigration tend to focus their rhetoric on appeals to emotion with scant evidence, such as claiming illegal immigrants are "taking our jobs" and "threatening our security".
Their only "solution" seems to be, "kick out the wetbacks and build a wall" (which would actually cause major damage to the U.S. economy).
Click on any of the following blue hyperlinks for more about "The Myths and the Facts About Immigration to the United States":
- The myths
- "Illegal immigrants do not pay taxes"
- "Immigrants come here to get 'welfare'"
- "Immigrants send all their money back to their home countries"
- "Immigrants take jobs and opportunity away from Americans"
- "Immigrants are a drain on the U.S. economy"
- "Immigrants don’t want to learn English or become Americans"
- "Most immigrants cross the border illegally"
- "Weak U.S. border enforcement has led to high levels of illegal immigration"
- "Illegal immigrants are the source of many communicable diseases"
- "Illegal immigrants cause crime"
- "The government is not enforcing existing immigration laws"
- Problems that arise when blanket deportation is attempted
- In a nutshell
- See also:
U.S. Immigration and Customs Enforcement (ICE)
U.S. Immigration and Customs Enforcement, commonly known by its initials ICE, is an agency of the Department of Homeland Security tasked with enforcing the United States' immigration laws.
It should be noted that ICE does not patrol the US borders; that task is handled by its sister agency, the United States Customs and Border Protection. Instead, ICE is responsible for arresting, detaining, and deporting undocumented immigrants already within the United States.
Under the presidency of Donald Trump, the agency has become unreasonably aggressive in its task of removing unwanted individuals to the point that it has become responsible for an ongoing human rights fiasco.
It wasn't all Trump's fault, however; ICE's abuses against undocumented immigrants happened under President Obama's watch as well.
ICE was established in 2003 by the Homeland Security Act, which created the Department of Homeland Security and integrated the former Immigration and Naturalization Service into what is now ICE.
Abuses:
Family separation policy and child detention centers:
As part of the Trump Administration's zero-tolerance immigration policy, the family separation policy of migrant children from their parents was officially adopted from April 2018 to June 2018, although it was found that the practice had begun a year prior to the policy's official announcement.
As of December 2018, roughly 15,000 children were detained in U.S. custody.
Rape and sexual assault:
ICE has a long history of rape and sexual assault within the agency.
In April 2018, The Intercept reported they had obtained 1,224 complaints of sexual assault filed between 2010 and September 2017, but DHS officials indicated they had received roughly 33,000 complaints between 2010 and 2016 that had not been shared with The Intercept. In February 2019, government documents revealed that over 5,800 migrant children faced alleged sexual abuse while in ICE custody.
Inhumane conditions:
Another report from The Intercept found that at the ICE processing center in Adelanto, California, the staff had been ignoring medical emergencies, the food was infested with maggots, and detainees working in the kitchens were being paid only $1 a day.
A report from the DHS Office of Inspector General reported guards mocking survivors of attempted suicide as "suicide failures", inadequate sanitation and food services within the facility, medical staff neglecting detainees, and staff members refusing to allow religious practices, telephone access and visitation.
Another DHS report found that the facility in Newark, New Jersey had kitchens serving raw, spoiled and expired meat and storing moldy bread, leaking roofs in every housing unit, flat and dilapidated bed mattresses held together with tied sheets, and showers with mold, mildew and peeling paint.
Additionally, in response to hunger strikes by eleven detainees at the processing center in El Paso, Texas, ICE officers began force-feeding six of them with nasal tubes, resulting in nosebleeds and vomiting. Immigrant children at Shenandoah Valley Juvenile Center in Virginia have said that they were beaten, left in their cells naked, and strapped to chairs by the guards.
ICE regularly inflicts solitary confinement on detainees at the slightest provocation, confining people to tiny cells for 23 hours a day and letting them pace around inside what amounts to a cage for one hour.
A policy adviser at the DHS revealed that ICE had been inflicting solitary confinement as punishment on gay detainees, transgender detainees, disabled detainees, detainees with mental illnesses, and detainees reporting abuses committed by ICE staff, with only half the cases being for rule violations.
The UN has declared that solitary confinement in excess of 15 days should be banned, as it is a form of torture. Despite this, the International Consortium of Investigative Journalists analyzed more than 8,400 records of ICE using solitary confinement on detainees, finding that more than half of the cases exceeded 15 days, 573 exceeded 90 days, 32 stints lasted more than a year, and fully one-third of detainees placed in solitary were mentally ill (373 of whom had to be placed on suicide watch).
Deaths in detention:
Under the Freedom of Information Act, The New York Times and the American Civil Liberties Union obtained documents revealing 107 deaths occurring in ICE custody from October 2003 to 2007. These deaths included:
- a man who was left in isolation for 13 hours following a head fracture,
- a man who committed suicide after being denied painkillers following a leg surgery after a motorcycle accident,
- a woman who did not receive treatment for a uterine fibroid tumor,
- a man who did not receive treatment for a kidney ailment,
- and a woman who was denied treatment for pancreatic cancer for several weeks.
Since 2007, that number has gone up to 188, with 22 deaths occurring in the last two years.
Of the 24 individuals who died during Trump's term in office, it has been concluded that at least six of those deaths were caused by the agency's failure to provide adequate medical care.
Trump has exacerbated the problem of deaths occurring in ICE custody: 2017, his first year in office, saw 12 detainees die in ICE custody, the greatest number since 2009, Obama's first year in office.
Clara Long, a senior researcher at Human Rights Watch, has said: "ICE has proven unable or unwilling to provide adequately for the health and safety of the people it detains. The Trump administration’s efforts to drastically expand the already-bloated immigration detention system will only put more people at risk."
Religious freedom violations:
See the main article on this topic: Freedom of religion
While the Trump administration touts its commitment to "religious freedom," both ICE and the Border Patrol frequently and intentionally violate the religious rights of their detainees.
Egregious examples of these abuses include:
- feeding nothing but pork sandwiches to a devout Muslim once every eight hours,
- punishing Sikhs for wearing head coverings and forcing them to pray near toilets,
- and repeatedly denying prayer services and religious texts for Muslims on the justification that "This is Glades County!"
This conduct is not limited to Muslims and Sikhs; Christians as well are consistently denied access to religious materials, services, and clergy. There are also reports of guards mocking Christian detainees for praying.
These violations of fundamental rights are unnecessary for any security purpose and appear to be intended as a means of harassment and a means of pressuring detainees to give up on their case and accept immediate deportation rather than to wait out the results of their case.
The US government has previously acknowledged how important religion can be for prisoner morale, mental health, and rehabilitation; what ICE is doing is illegal under the Religious Land Use and Institutionalized Persons Act of 2000, which guarantees religious rights for all prisoners in the United States, whether documented or otherwise.
Incompetence and corruption:
In February 2019, the DHS Office of Inspector General reported that ICE had not been holding private contractors accountable for issues at their detention centers, issuing only two fines across 14,003 cases between October 2015 and June 2018, cases that included sexual assault and the use of tear gas.
ICE has also proven to be dangerously vindictive. When citizens in several North Carolina counties elected sheriffs who agreed to terminate the 287(g) cooperation agreements with ICE, which notify federal agents of the immigration status of detained individuals, ICE retaliated with a five-day crackdown in those counties that arrested 200 people, 60 of whom ICE had previously held no interest in.
This was done to cause maximum chaos in order to punish the citizens and sheriffs of those counties for refusing to comply with ICE's draconian anti-immigrant measures. ICE also punishes detainees who have the temerity to cooperate in legal cases against them.
Such was the case of several Iraqi detainees, who were ACLU clients, in Arizona who were told they were about to be released, only for their guards to reveal mid-charade that they were actually only being sent to a different ICE facility. This was an intentional and cruel form of psychological punishment.
ICE's American police state:
Perhaps the most frightening aspect of ICE's enforcement procedures for everyday US citizens is its policy of launching mass raids against anyone who looks too brown to be a "real American." This has been standard ICE practice since the mid-Bush administration.
In 2006 ICE started using the excuse of investigating "document fraud" to raid workplaces across the country with ridiculously heavily armed agents. President George W. Bush publicly announced his desire to purge America of all illegal immigrants, directly causing 2008 to be a record year for workplace raids, with 6,287 arrests.
Obama issued a series of executive orders to focus ICE on dangerous criminals, but Trump's "zero tolerance" policy reversed that to target potentially any illegal immigrant.
ICE raids are traumatic events for the communities they target, as ICE rushes in with military equipment and helicopters to whisk away hundreds of people, leaving jobs empty, families torn apart, and local organizations straining to return things to normality.
Although it seems logical that ICE should first go after dangerous criminals, the agency instead targets so-called "low-hanging fruit": people without criminal records who visit ICE for routine check-ins in the hope of earning citizenship, or who testify against criminals in court. The effect of ICE raids is widespread fear among immigrant communities and a complete withdrawal from civil and social life.
ICE also runs roughshod over the law. In the wake of Trump's "zero tolerance" policy, ICE and CBP agents have taken the law into their own hands by ignoring and violating court orders, denying access to legal counsel, and forcing people to sign documents they have not read.
ICE also arrests DACA beneficiaries who are here legally and conducts warrantless raids in defiance of the Fourth Amendment. Increasing aggressiveness in raiding has resulted in ICE casting a pall over immigrant and minority communities with the threat of "collateral arrests", or arrests of individuals not actually targeted by the raid but who happen to have the wrong skin color.
Even more absurdly, ICE has taken to arresting people while they are in the process of applying for citizenship, a counterproductive policy that discourages people from complying with the law.
Even legal immigrants are now afraid to visit government buildings, as they are afraid that the ICE agents loitering outside are there for them, too.
ICE uses intimidation tactics against US citizens associated with their targets in the hopes of circumventing constitutional rights and getting away with misrepresenting the law.
Widespread racial profiling and aggressive raiding has led to ICE mistakenly arresting almost 1,500 legal United States citizens as of 2018, with ICE agents ignoring and mocking detainees' rightful claims of citizenship.
See also:
- Crimes against humanity
- George W. Bush - The man who brought ICE into existence.
- Barack Obama - The man who continued its existence.
- Donald Trump - The man who expanded and intensified its brutality and extremism.
- TSA — another agency of the DHS.
Koch Family, including its Political Activities
- YouTube Video: Billionaire Charles Koch on fighting in the political arena
- YouTube Video: Koch brothers to intervene in 2020 GOP primaries to unseat Trump
- YouTube Video by Charles Koch: Political System 'Rigged,' But Not By Me
The Koch family is an American family engaged in business and most noted for their political activities (donating to libertarian, criminal justice reform, and Republican Party causes) and their control of Koch Industries, the second-largest privately owned company in the United States (with 2017 revenues of $100 billion).
The family business was started by Fred C. Koch, who developed a new cracking method for the refinement of heavy crude oil into gasoline. Fred's four sons litigated against each other over their interests in the business during the 1980s and 1990s.
By 2019, Charles G. Koch and the late David H. Koch, commonly referred to as the Koch brothers, were the only ones of Fred Koch's four sons still with Koch Industries. Charles and David Koch built a political network of conservative donors and the brothers funneled financial revenue into television and multi-media advertising.
Click on any of the following blue hyperlinks for more about the Koch Family:
___________________________________________________________________________
The political activities of the Koch brothers include the financial and political influence of Charles G. and David H. Koch (1940–2019) on United States politics. This influence is seen both directly and indirectly via various political and public policy organizations that were supported by the Koch brothers.
The Koch brothers are the sons of Fred C. Koch (1900–1967), who founded Koch Industries, the second-largest privately held company in the United States, of which they own 84% of the stock. Having bought out two other brothers' interests, they remain in control of the family business, the fortune which they inherited from their father, and the Koch family foundations.
The brothers have made significant financial contributions to libertarian and conservative think tanks and have donated primarily to Republican Party candidates running for office.
A network of like-minded donors organized by the Kochs pledged to spend $889 million from 2009 to 2016, and its infrastructure has been said by Politico to rival "that of the Republican National Committee."
They actively fund and support organizations that contribute significantly to Republican candidates, and in particular that lobby against efforts to expand government's role in health care and climate change mitigation. By 2010, they had donated more than $100 million to dozens of free-market and advocacy organizations.
In May 2019, the Kochs announced a major restructuring of their philanthropic efforts, stating that the Koch network will henceforth operate under the umbrella of Stand Together, a nonprofit focused on supporting community groups.
The stated priorities of the restructured Koch network include efforts aimed at increasing employment, addressing poverty and addiction, ensuring excellent education, building a stronger economy, and bridging divides and building respect.
Click on any of the following blue hyperlinks for more about the Koch Family Political Activities:
- Background
- Political activity
- Organizations
- Issues and policy
- Immigration
- Jane Mayer article in The New Yorker
- See also:
- Campaign finance in the United States
- Citizen Koch, 2013 documentary film
- Koch Brothers Exposed, a documentary film about the political activities of the Koch brothers
- KochPAC, the Koch Industries Inc Political Action Committee
- Political finance
- "Koch brothers collected news and commentary". The Guardian.
- Inside the Koch Brothers' Toxic Empire. Rolling Stone. September 24, 2014.
- Is Charles Koch a closet liberal?
George Soros, including his Political Activities
- YouTube Video: Who is George Soros? - BBC News
- YouTube Video: George Soros: Donald Trump Will Fail And Markets Won't Do Well
- YouTube Video: Debunking the myths surrounding George Soros
George Soros, Hon FBA (born Schwartz György; August 12, 1930) is a Hungarian-American investor and philanthropist. As of February 2018, he had a net worth of $8 billion, having donated more than $32 billion to his philanthropic agency, Open Society Foundations.
Born in Budapest, Soros survived Nazi Germany-occupied Hungary and emigrated to the United Kingdom in 1947. He attended the London School of Economics, graduating with a bachelor's and eventually a master's degree in philosophy.
Soros began his business career by taking various jobs at merchant banks in the United Kingdom and then the United States, before starting his first hedge fund, Double Eagle, in 1969. Profits from his first fund furnished the seed money to start Soros Fund Management, his second hedge fund, in 1970. Double Eagle was renamed to Quantum Fund and was the principal firm Soros advised.
At its founding, Quantum Fund had $12 million in assets under management, and as of 2011 it had $25 billion, the majority of Soros's overall net worth.
Soros is known as "The Man Who Broke the Bank of England" because of his short sale of US$10 billion worth of pounds sterling, which made him a profit of $1 billion during the 1992 Black Wednesday UK currency crisis.
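The arithmetic behind that trade is straightforward short-sale profit and loss. The sketch below uses illustrative entry and exit rates of 2.00 and 1.80 US dollars per pound (a roughly 10% depreciation, consistent with the reported $10 billion position and $1 billion profit, but assumed for illustration rather than taken from the source):

```python
def short_sale_profit(notional_sold, entry_price, exit_price):
    """Profit from selling borrowed currency at entry_price and
    buying it back at exit_price (prices in USD per GBP)."""
    units = notional_sold / entry_price   # pounds sold short
    buyback_cost = units * exit_price     # cost to repurchase the pounds
    return notional_sold - buyback_cost

# Illustrative 10% fall in sterling: 2.00 -> 1.80 USD/GBP
profit = short_sale_profit(10e9, 2.00, 1.80)  # ≈ $1 billion
```

The profit is simply the notional sold minus the cost of buying the currency back after it has fallen; a 10% drop on a $10 billion short position yields about $1 billion.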
Based on his early studies of philosophy, Soros formulated an application of Karl Popper's General Theory of Reflexivity to capital markets, which he claims gives him a clear picture of asset bubbles, the fundamental versus market value of securities, and the value discrepancies he exploits when shorting and swapping stocks.
Soros is a well-known supporter of progressive and liberal political causes, to which he dispenses donations through his foundation, the Open Society Foundations. Between 1979 and 2011, he donated more than $11 billion to various philanthropic causes; by 2017, his donations "on civil initiatives to reduce poverty and increase transparency, and on scholarships and universities around the world" totaled $12 billion.
Soros influenced the collapse of communism in Eastern Europe in the late 1980s and early 1990s, and provided one of Europe's largest higher education endowments to the Central European University in his Hungarian hometown. His extensive funding of political causes has made him a "bugaboo of European nationalists".
Numerous American conservatives have promoted false claims that characterize Soros as a singularly dangerous "puppetmaster" behind a variety of alleged global plots, with The New York Times reporting that by 2018 these claims had "moved from the fringes to the mainstream" of Republican politics.
Click on any of the following blue hyperlinks for more about George Soros:
- Early life and education
- Investment career
- Personal life
- Political involvement
- Conspiracy theories and threats
- Attempted assassination
- Political and economic views
- Wealth and philanthropy
- Honors and awards
- Publications and scholarship
- See also:
- Official website
- Open Society Foundations
- Institute for New Economic Thinking
- Column archives at Project Syndicate
- Column archives at The New York Review of Books
- Appearances on C-SPAN
- George Soros on Charlie Rose
- "George Soros collected news and commentary". The Guardian.
- "George Soros collected news and commentary". The New York Times.
- Forbes.com: George Soros
- NYTimes: George Soros
- Membership at the Council on Foreign Relations
- Forbes 400
- Scott Bessent, former chief investment officer of Soros Fund Management
- Quincy Institute for Responsible Statecraft
Whistleblower Protection Act of 1989
- YouTube Video: Pelosi announces formal impeachment inquiry, GOP reacts: watch live
- YouTube Video: Cuomo to Trump attorney: Quid pro quo isn't necessary for impeachment
- YouTube Video: Nicolle Wallace Fact Checks The President In Real Time | Deadline | MSNBC
White House Expected to Release Whistleblower Complaint. Here’s What We Know About It So Far.
By Elliot Hannon, Slate Magazine, Sept. 26, 2019, 6:26 AM
It was the Washington Post and New York Times’ reporting on the whistleblower complaint about Trump’s call with the Ukrainian president that prompted the White House to release an approximate transcript of the call and, in turn, Democrats in Congress to launch a formal impeachment inquiry.
The seriousness of the situation accelerated events, and the story, past the whistleblower's account of the initial July call between Donald Trump and President Volodymyr Zelensky of Ukraine. But the contents of what the whistleblower saw and heard, and what the intelligence community inspector general discovered in response to the complaint, could provide vital context for the call, how the White House viewed it at the time, and what happens next.
After initially trying to suppress the complaint, the Trump administration is reportedly set to release it, potentially as early as Thursday. Congressional leaders and members of the intelligence committees were allowed to review the complaint, which is still classified, on Wednesday night.
Until we have the text of the real thing, here's what we know from the reporting so far about what's in the complaint and what it means:
• The intelligence officer who filed the complaint did so not only because the official was concerned about what transpired on the call, but, according to the New York Times, was also alarmed by the unusual way the White House handled the internal record of the conversation. The Washington Post reports that at some point, the White House moved records of Trump’s communications with foreign officials from where they are normally kept onto a separate computer network.
• The whistleblower was not, however, personally on the July 25 call, nor did the complainant have access to the readout of the call before coming forward, according to Michael Atkinson, the inspector general for the intelligence community.
• The whistleblower instead was made aware of the call by White House officials expressing concern that Trump, according to a Justice Department memo, “abused his authority or acted unlawfully in connection with foreign diplomacy.”
• In the complaint, the whistleblower said there were multiple White House officials who were witness to the call and could corroborate the allegations.
• The inspector general, a Trump appointee who elevated the complaint, interviewed these witnesses.
• The inspector general concluded from his fact-finding that Trump had potentially broken the law by soliciting what amounted to a foreign campaign contribution, which exposed the president “to serious national security and counterintelligence risks.”
Of all of the new data points on the whistleblower complaint, now that the White House’s version of the transcript is out in the open, the most intriguing bit of information is the whistleblower’s apparent concern about how the Trump administration was characterizing and memorializing the contents of the call. Were portions of the call intentionally omitted from the record? Or distorted? That will be an important line of inquiry when the full complaint is released.
[End of Slate Article]
___________________________________________________________________________
The Whistleblower Protection Act of 1989, 5 U.S.C. 2302(b)(8)-(9), Pub.L. 101-12 as amended, is a United States federal law that protects federal whistleblowers who work for the government and report the possible existence of an activity constituting a violation of law, rule, or regulation, or gross mismanagement, a gross waste of funds, an abuse of authority, or a substantial and specific danger to public health and safety.
A federal agency violates the Whistleblower Protection Act if agency authorities take (or threaten to take) retaliatory personnel action against any employee or applicant because of disclosure of information by that employee or applicant.
Authorized Federal Agencies:
- The Office of Special Counsel investigates federal whistleblower complaints. In October 2008, then-special counsel Scott Bloch resigned amid an FBI investigation into whether he obstructed justice by illegally deleting computer files following complaints that he had retaliated against employees who disagreed with his policies. Then-Senator Barack Obama made a campaign vow to appoint a special counsel committed to whistleblower rights. It was not until April 2011 that President Obama's appointee Carolyn Lerner was confirmed by the Senate. Today, the primary mission of OSC is to safeguard the merit system by protecting federal employees and applicants from prohibited personnel practices, especially reprisal for whistleblowing.
- The Merit Systems Protection Board, a quasi-judicial agency that adjudicates whistleblower complaints, uses appointed administrative law judges who often back the government. Since 2000, the board has ruled for whistleblowers just three times in 56 cases decided on their merits, according to a Government Accountability Project analysis. Obama appointed a new chairperson and vice chairperson with backgrounds as federal worker advocates, but Tom Devine of GAP says, "It's likely to take years for them to turn things around." Currently, this office works to protect the Merit System Principles and promote an effective Federal workforce free of Prohibited Personnel Practices.
- The Court of Appeals for the Federal Circuit was established under Article III of the Constitution on October 1, 1982. The only court empowered to hear appeals of whistleblower cases decided by the merit board, it has been criticized by Senator Grassley (R-Iowa) and others in Congress for misinterpreting whistleblower laws and setting a precedent that is hostile to claimants. Between 1994 and 2010, the court ruled for whistleblowers in only three of 203 cases decided on their merits, GAP's analysis found.
Legal Cases:
The U.S. Supreme Court, in the case of Garcetti v. Ceballos, 04-473, ruled in 2006 that government employees do not have protection from retaliation by their employers under the First Amendment of the Constitution when they speak pursuant to their official job duties.
The U.S. Merit Systems Protection Board (MSPB) uses agency lawyers in the place of administrative law judges to decide federal employees' whistleblower appeals. These lawyers, dubbed "attorney examiners," deny 98% of whistleblower appeals; the Board and the Federal Circuit Court of Appeals give great deference to their initial decisions, resulting in affirmance rates of 97% and 98%, respectively.
The most common characteristics for a court claim that are encompassed within the protection of the Act include: that the plaintiff is an employee or person covered under the specific statutory or common law relied upon for action, that the defendant is an employer or person covered under the specific statutory or common law relied upon for the action, that the plaintiff engaged in protected whistleblower activity, that the defendant knew or had knowledge that the plaintiff engaged in such activity, that there was retaliatory action taken against the one doing the whistleblowing and that the unfair treatment would not have occurred if the plaintiff hadn't brought to attention the activities.
Robert MacLean blew the whistle on a TSA plan to cut back air marshal coverage. In 2009, MacLean, represented by the Government Accountability Project, challenged his dismissal before the Merit Systems Protection Board on the grounds that "his disclosure of the text message was protected under the Whistleblower Protection Act of 1989, because he 'reasonably believe[d]' that the leaked information disclosed 'a substantial and specific danger to public health or safety'." The Supreme Court ruled 7–2 in MacLean's favor in January 2015.
Whistleblower Protection Enhancement Act and Presidential Policy Directive 19:
President Barack Obama issued Presidential Policy Directive 19 (PPD-19), entitled "Protecting Whistleblowers with Access to Classified Information". The directive, signed by Obama on October 10, 2012, states that it "ensures that employees (1) serving in the Intelligence Community or (2) who are eligible for access to classified information can effectively report waste, fraud, and abuse while protecting classified national security information. It prohibits retaliation against employees for reporting waste, fraud, and abuse."
However, according to a report that the Committee on Homeland Security and Governmental Affairs submitted to accompany S. 743, "the federal whistleblowers have seen their protections diminish in recent years, largely as a result of a series of decisions by the United States Court of Appeals for the Federal Circuit, which has exclusive jurisdiction over many cases brought under the Whistleblower Protection Act (WPA).
Specifically, the Federal Circuit has accorded a narrow definition to the type of disclosure that qualifies for whistleblower protection. Additionally, the lack of remedies under current law for most whistleblowers in the intelligence community and for whistleblowers who face retaliation in the form of withdrawal of the employee's security clearance leaves unprotected those who are in a position to disclose wrongdoing that directly affects our national security." S. 743 would address these problems by restoring the original congressional intent of the WPA to adequately protect whistleblowers, by strengthening the WPA, and by creating new whistleblower protections for intelligence employees and new protections for employees whose security clearance is withdrawn in retaliation for having made legitimate whistleblower disclosures. S. 743 ultimately became Pub.L. 112-199 (S.Rep. 112-155).
Related legislation:
On July 14, 2014, the United States House of Representatives voted to pass the All Circuit Review Extension Act (H.R. 4197; 113th Congress), a bill that allows federal employees appealing Merit Systems Protection Board judgments in whistleblower cases to file in any U.S. Court of Appeals that has jurisdiction.
The bill would extend the period allowed for (1) filing a petition for judicial review of Merit Systems Protection Board decisions in whistleblower cases, and (2) any review of such a decision by the Director of the Office of Personnel Management (OPM), which had been set to expire three years after the effective date of the Whistleblower Protection Enhancement Act of 2012 (i.e., December 27, 2012).
There is also a bill in its early stages called the FBI Whistleblower Protection Enhancement Act, introduced by Senators Grassley and Leahy in 2015. It would protect FBI whistleblowers more completely from reprisals by supervisors, by assigning administrative law judges to adjudicate their cases and by enabling FBI employees to appeal decisions to the courts.
See also:
- Whistleblower protection in the United States
- False Claims Act
- Federal crime
- Immunity from prosecution
- Informant
- List of whistleblowers
- Qui tam
- Testimony
- Turn state's evidence
- White collar crime
- Witness
- Witness intimidation
- United States Federal Witness Protection Program
- United States Marshals Service
Corruption, with a focus on the United States
- YouTube Video of Top 10 U.S. Political Scandals (by WatchMojo)
- YouTube Video: "Corruption is Legal in America" by RepresentUS
- YouTube Video: The Unprecedented Corruption Of President Donald Trump | All In | MSNBC
In general, corruption is a form of dishonesty or criminal activity undertaken by a person or organization entrusted with a position of authority, often to acquire illicit benefit; it is the abuse of entrusted power for private gain.
Corruption may include many activities including bribery and embezzlement, though it may also involve practices that are legal in many countries. Political corruption occurs when an office-holder or other governmental employee acts in an official capacity for personal gain.
Corruption is most commonplace in kleptocracies, oligarchies, narco-states and mafia states.
Corruption can occur on different scales. Corruption ranges from small favors between a small number of people (petty corruption), to corruption that affects the government on a large scale (grand corruption), and corruption that is so prevalent that it is part of the everyday structure of society, including corruption as one of the symptoms of organized crime.
Corruption and crime are endemic sociological occurrences which appear with regular frequency in virtually all countries on a global scale in varying degree and proportion. Individual nations each allocate domestic resources for the control and regulation of corruption and crime. Strategies to counter corruption are often summarized under the umbrella term anti-corruption.
Click on any of the following blue hyperlinks for more about Global Corruption:
- Scales of corruption
- In different sectors
- Methods
- Corruption and economic growth
- Causes of corruption
- Anti-corruption programmes
- Corruption tourism
- Legal corruption
- Historical responses in philosophical and religious thought
- See also:
- Angolagate
- Academic careerism
- Accounting scandals
- Anti-globalization movement
- Appearance of corruption
- Attempted corruption
- Biens mal acquis
- Business ethics
- Conflict of interest
- Crony capitalism
- Corporate abuse
- Corporate Accountability International
- Corporate warfare
- CorpWatch
- Corporate crime
- Corporate malfeasance
- Corruption Perceptions Index
- Enron
- "Examines through attempted corruption"
- Federal Bureau of Investigation
- Graft (politics)
- Guanxi
- Goldman Sachs
- Investigative magistrate
- Industrial espionage
- Kaunas golden toilet case
- Kleptocracy
- Lobbying
- List of companies convicted of felony offenses in the United States
- Multinational Monitor
- Political corruption
- Penny stock scam
- Pump and dump
- Pay to play
- Regulatory capture
- Second economy of the Soviet Union
- Transparency International
- Trial in absentia
- United Nations Convention against Corruption
- Wasta
- Whistleblowers
- Organi-cultural Deviance
- Operation Car Wash
- ZTS OSOS
Corruption in the United States is the act of a local, state or federal official using some form of influence or being influenced in some way, typically through bribery.
Corruption is used to the advantage of the government official and a person or group of people.
The U.S. is the 18th least corrupt country in the world, according to the 2016 Corruption Perceptions Index by Transparency International.
Click on any of the following blue hyperlinks for more about Corruption in the United States:
- Campaign finance in the United States
- Gerrymandering in the United States
- Lobbying in the United States
- Voter suppression in the United States
- Operation Ill Wind
- Police corruption in New York City
- Vote early and vote often
Federal corruption convictions:
- List of United States federal officials convicted of corruption offenses
- List of United States state officials convicted of federal corruption offenses
- List of United States local officials convicted of federal corruption offenses
Political scandals and crimes:
- List of federal political scandals in the United States
- List of federal political sex scandals in the United States
- List of American federal politicians convicted of crimes
- Political scandals in the United States by state
- State and local political sex scandals in the United States
- List of American state and local politicians convicted of crimes
See also:
- Crime in the United States
- Corruption by country
- United States at the Business Anti-Corruption Portal
- United States at Transparency International
- United States at the Global Corruption Barometer
- Cole, Juan (December 3, 2013). "Top 10 Ways the US is the Most Corrupt Country in the World". Informed Comment. Archived from the original on November 29, 2017. Retrieved May 7, 2017.
(Political) Lobbying in the United States including Citizens United v. FEC and "End Citizens United"
TOP: HOW MUCH MONEY IS SPENT ON LOBBYING EACH YEAR?
BOTTOM: Defense contractors such as Boeing and Lockheed Martin sell extensively to the government and must, of necessity, engage in lobbying to win contracts.
- YouTube Video: Corruption is Legal in America
- YouTube Video: Interest Groups: Crash Course Government and Politics
- YouTube Video: What You Probably Haven't Heard About Citizens United
Your Webhost: no matter what your or my political beliefs may be, the corruption of our democracy can be expected whenever the politics of money prevails. Below, we cover three topics:
Lobbying in the United States describes paid activity in which special interests hire well-connected professional advocates, often lawyers, to argue for specific legislation in decision-making bodies such as the United States Congress. It is a highly controversial phenomenon, often seen in a negative light by journalists and the American public, with some critics describing it as a legal form of bribery or extortion.
While lobbying is subject to extensive and often complex rules which, if not followed, can lead to penalties including jail, the activity of lobbying has been interpreted by court rulings as constitutionally protected free speech and a way to petition the government for the redress of grievances, two of the freedoms protected by the First Amendment of the Constitution.
Since the 1970s, lobbying activity has grown immensely in the United States in terms of the numbers of lobbyists and the size of lobbying budgets, and has become the focus of much criticism of American governance.
Since lobbying rules require extensive disclosure, there is a large amount of public information about which entities lobby, how, at whom, and for how much. The current pattern suggests that most lobbying is done by corporations, although a wide variety of coalitions representing diverse groups also lobby. Lobbying takes place at every level of government, including federal, state, county, and municipal governments.
In Washington, D.C., lobbying usually targets members of Congress, although there have been efforts to influence executive agency officials as well as Supreme Court appointments.
Lobbying can have an important influence on the political system; for example, a study in 2014 suggested that special interest lobbying enhanced the power of elite groups and was a factor shifting the nation's political structure toward an oligarchy in which average citizens have "little or no independent influence".
The number of lobbyists in Washington is estimated to be over twelve thousand, but most lobbying (in terms of expenditures), is handled by fewer than 300 firms with low turnover.
A report in The Nation in 2014 suggested that while the number of registered lobbyists in 2013 (12,281) decreased compared to 2002, lobbying activity was increasing and "going underground" as lobbyists use "increasingly sophisticated strategies" to obscure their activity. Analyst James A. Thurber estimated that the actual number of working lobbyists was close to 100,000 and that the industry brings in $9 billion annually.
Lobbying has been the subject of academic inquiry in various fields, including law, public policy, economics and even marketing strategy.
Overview:
Political scientist Thomas R. Dye once said that politics is about battling over scarce governmental resources: who gets them, where, when, why and how.
Since government makes the rules in a complex economy such as the United States, it is logical that various organizations, businesses, individuals, nonprofits, trade groups, religions, charities and others—which are affected by these rules—will exert as much influence as they can to have rulings favorable to their cause.
And the battling for influence has happened in every organized society since the beginning of civilization, whether it was Ancient Athens, Florence during the time of the Medici, Late Imperial China, or the present-day United States. Modern-day lobbyists in one sense are like the courtiers of the Ancien Régime. If voting is a general way for a public to control a government, lobbying is a more specific, targeted effort, focused on a narrower set of issues.
The term lobby has etymological roots in the physical structure of the British Parliament, in which there was an intermediary covered room outside the main hall. People pushing an agenda would try to meet with members of Parliament in this room, and they came to be known, by metonymy, as lobbyists, although one account in 1890 suggested that the application of the word "lobby" is American and that the term is not used as much in Britain.
The Willard Hotel, two blocks from the White House at 1401 Pennsylvania Avenue, claims the term originated there: "It was in the Willard lobby that Ulysses S. Grant popularized the term 'lobbyist.' Often bothered by self-promoters as he sat in the lobby and enjoyed his cigar and brandy, he referred to these individuals as 'lobbyists.'"
The term lobbying in everyday parlance can describe a wide variety of activities, and in its general sense, suggests advocacy, advertising, or promoting a cause. In this sense, anybody who tries to influence any political position can be thought of as "lobbying", and sometimes the term is used in this loose sense. A person who writes a letter to a congressperson, or even questions a candidate at a political meeting, could be construed as being a lobbyist.
However, the term "lobbying" generally means a paid activity with the purpose of attempting to "influence or sway" a public official – including bureaucrats and elected officials – towards a desired specific action often relating to specific legislation. If advocacy is disseminating information, including attempts to persuade public officials as well as the public and media to promote the cause of something and support it, then when this activity becomes focused on specific legislation, either in support or in opposition, then it crosses the line from advocacy and becomes lobbying. This is the usual sense of the term "lobbying." One account suggested that much of the activity of nonprofits was not lobbying per se, since it usually did not mean changes in legislation.
A lobbyist, according to the legal sense of the word, is a professional, often a lawyer.
Lobbyists are intermediaries between client organizations and lawmakers: they explain to legislators what their organizations want, and they explain to their clients what obstacles elected officials face. One definition of a lobbyist is someone "employed to persuade legislators to pass legislation that will help the lobbyist's employer." Many lobbyists work in lobbying firms or law firms, some of which retain clients outside lobbying. Others work for advocacy groups, trade associations, companies, and state and local governments.
Lobbyists can themselves be government officials, such as a state governor who presses officials in Washington for specific legislation. A lobbyist may put together a diverse coalition of organizations and people, sometimes including lawmakers and corporations, and the whole effort may itself be considered a lobby; for example, on the abortion issue there is a "pro-choice lobby" and a "pro-life lobby".
An estimate from 2007 reported that more than 15,000 federal lobbyists were based in Washington, DC; another estimate from 2011 suggested that the count of registered lobbyists who have actually lobbied was closer to 12,000. While numbers like these suggest that lobbying is a widespread activity, most accounts suggest that the Washington lobbying industry is an exclusive one run by a few well-connected firms and players, with serious barriers to entry for firms wanting to get into the lobbying business, since it requires them to have been "roaming the halls of Congress for years and years."
It is possible for foreign nations to influence the foreign policy of the United States through lobbying or by supporting lobbying organizations directly or indirectly. For example, in 2016, Taiwanese officials hired American senator-turned-lobbyist Bob Dole to set up a controversial phone call between president-elect Donald Trump and Taiwanese President Tsai Ing-Wen.
There are reports that the National Rifle Association, a U.S.-based lobbying group advocating for gun rights, has been the target of a decade-long infiltration effort by Russian president Vladimir Putin, with allegations that Putin funneled cash through the NRA to aid the election of Donald Trump. There are also reports that Qatar, Saudi Arabia, Bahrain and the United Arab Emirates have waged an intense lobbying campaign to win over the Trump administration and Congress.
Different types of lobbying:
The focus of lobbying efforts:
Generally, lobbyists focus on trying to persuade decision-makers: Congress, executive branch agencies such as the Treasury Department and the Securities and Exchange Commission, the Supreme Court, and state governments (including governors).
Federal agencies have been targeted by lobbyists since they write industry-specific rules; accordingly, interest groups spend "massive sums of money" trying to persuade them to make so-called "carve-outs" or try to block specific provisions from being enacted. A large fraction of overall lobbying is focused on only a few sets of issues, according to one report.
It is possible for one level of government to lobby another level; for example, the District of Columbia has been lobbying Congress and the President for greater power, including possible statehood or voting representation in Congress; one assessment in 2011 suggested that the district needed to rethink its lobbying strategy, since its past efforts have only had "mixed results".
Many executive branch agencies have the power to write specific rules and are a target of lobbying. Federal agencies such as the State Department make rules such as giving aid money to countries such as Egypt, and in one example, an Egyptian-American businessman named Kais Menoufy organized a lobby to try to halt U.S. aid to Egypt.
Since the Supreme Court has the power of judicial review and can render a congressional law unconstitutional, it has great power to influence the course of American life. For example, in the Roe v. Wade decision, it ruled on the legality of abortion. A variety of forces use lobbying tactics to pressure the court to overturn this decision.
Lobbyists represent their clients' or organizations' interests in state capitols. An example is a former school superintendent who has been lobbying state legislatures in California, Michigan and Nevada to overhaul teacher evaluations, and trying to end the "Last In, First Out" teacher hiring processes; according to one report, Michelle Rhee is becoming a "political force."
State governments can be lobbied by groups representing other governments within the state, such as city authorities; for example, the cities of Tallahassee and St. Petersburg used paid lobbyists to represent their interests before the Florida legislature. There is also lobbying activity at the county and municipal levels, especially in larger cities and populous counties. For example, Chicago city officials known as aldermen have become lobbyists after serving in municipal government, once the one-year abstention period required by city ethics rules has passed.
Paid versus free lobbying:
While the bulk of lobbying is done by business and professional interests that hire paid professionals, some lobbyists represent non-profits pro bono on issues in which they are personally interested. Pro bono clients may also offer such lobbyists opportunities to meet and socialize with local legislators at events like fundraisers and awards ceremonies.
Single issue versus multiple issue lobbying:
Lobbies which push for a single issue have grown in importance during the past twenty years, according to one source. Corporations would generally be considered single-issue lobbies: if a corporation wishes to change public policy, or to influence legislation which impacts its success as a business, it may use lobbying as a "primary avenue" for this purpose. One research study suggested that single-issue lobbies often operate in different kinds of institutional venues, sometimes bringing the same message to different groups.
Lobbies which represent groups such as labor unions, business organizations, trade associations and such are sometimes considered to be multiple issue lobbies, and to succeed they must be somewhat more flexible politically and be willing to accept compromise.
Inside versus outside lobbying:
History of lobbying:
Main article: History of lobbying in the United States
The Constitution was crafted in part to solve the problem of special interests, today usually represented by lobbies, by having these factions compete.
James Madison identified a faction as "a number of citizens, whether amounting to a minority or majority of the whole, who are united and actuated by some common impulse of passion, or of interest, adverse to the rights of other citizens, or to the permanent and aggregate interests of the community", and Madison argued in Federalist No. 10 that there was less risk of injury by a narrowly focused faction in a large republic if any negative influence was counteracted by other factions.
In addition, the Constitution protected free speech, including the right to petition the government, and these rights have been used by lobbying interests throughout the nation's history. There has been lobbying at every level of government, particularly in state governments during the nineteenth century, but increasingly directed towards the federal government in the twentieth century. The last few decades have been marked by an exponential increase in lobbying activity and expenditures.
Lobbying as a business:
Key players:
Lobbyists: The number of registered Washington lobbyists is substantial. In 2009, the Washington Post estimated that there were 13,700 registered lobbyists, describing the nation's Capitol as "teeming with lobbyists."
In 2011, The Guardian estimated that in addition to the approximately 13,000 registered lobbyists, thousands more unregistered lobbyists could exist in Washington. The healthcare industry alone employed six lobbyists for every elected politician, according to one account.
Nevertheless, the number of lobbyists actively engaged in lobbying is considerably smaller, and the number occupied with lobbying full-time and making significant money is smaller still.
Law firms:
Several law firms, including Patton Boggs, Akin Gump and Holland & Knight, had sizable departments devoted to so-called "government relations". One account suggested that the lobbying arms of these law firms were not held as separate subsidiaries, but that the law practices involved in government lobbying were integrated into the overall framework of the law firm.
A benefit to an integrated arrangement was that the law firm and the lobbying department could "share and refer clients back and forth". Holland & Knight earned $13.9 million from lobbying revenue in 2011.
One law firm employs so-called "power brokers" including former Treasury department officials such as Marti Thomas, and former presidential advisers such as Daniel Meyer.
There was a report that two law firms were treating their lobbying groups as separate business units, and giving the non-lawyer lobbyists an equity stake in the firm.
Lobbying firms:
These firms usually have some lawyers in them, and are often founded by former congressional staffers, legislators, or other politicians. Some lobbying groups have been bought by large advertising conglomerates.
Corporations:
Corporations which lobby actively tend to be few in number, large, and often sell to the government. Most corporations do not hire lobbyists. One study found that the actual number of firms which do lobbying regularly is fewer than 300, and that the percent of firms engaged in lobbying was 10% from 1998 to 2006, and that they were "mainly large, rich firms getting in on the fun."
These firms hired lobbyists year after year, and there was not much evidence of other large firms taking much interest in lobbying. Corporations considering lobbying run into substantial barriers to entry: corporations have to research the relevant laws about lobbying, hire lobbying firms, and cultivate influential people and make connections.
When an issue regarding a change in immigration policy arose, large corporations currently lobbying switched focus somewhat to take account of the new regulatory world, but new corporations—even ones likely to be affected by any possible rulings on immigration—stayed out of the lobbying fray, according to the study.
Still, of all the entities doing lobbying in Washington, the biggest overall spenders are, in fact, corporations. In the first decade of the 2000s, the most lucrative clients for Gerald Cassidy's lobbying firm were corporations, displacing fees from the appropriations business.
Wall Street lobbyists and the financial industry spent upwards of $100 million in one year to "court regulators and lawmakers", particularly since they were "finalizing new regulations for lending, trading and debit card fees." One academic analysis in 1987 found that firms were more likely to spend on lobbying if they were both large and concerned about "adverse financial statement consequences" if they did not lobby.
Big banks were "prolific spenders" on lobbying; JPMorgan Chase has an in-house team of lobbyists who spent $3.3 million in 2010; the American Bankers Association spent $4.6 million on lobbying; an organization representing 100 of the nation's largest financial firms called the Financial Services Roundtable spent heavily as well. A trade group representing hedge funds spent more than $1 million in one quarter trying to influence the government about financial regulations, including an effort to change a rule that might demand greater disclosure requirements for funds.
Amazon.com spent $450,000 in one quarter lobbying about a possible online sales tax as well as rules about data protection and privacy. Corporations which sell substantially to the government tend to be active lobbiers. For example, aircraft manufacturer Boeing, which has sizeable defense contracts, pours "millions into lobbying":
"Boeing Co. is one of the most influential companies in airline manufacturing and has continually shown its influence in lobbying Congress ... Between January and September, Boeing spent a total of $12 million lobbying according to research by the Center for Responsive Politics.
Additionally, Boeing has its own political action committee, which donated more than $2.2 million to federal candidates during the 2010 election cycle. Of that sum, 53 percent went to Democrats. ... Through September, Boeing's PAC has donated $748,000 to federal politicians." — Chicago Sun-Times quoting OpenSecrets.org, 2011
In the spring of 2017, there was a fierce lobbying effort by Internet service providers (ISPs) such as Comcast and AT&T, and tech firms such as Google and Facebook, to undo regulations protecting consumer privacy.
Rules passed by the Obama administration in 2016 required ISPs to get "explicit consent" from consumers before gathering browsing histories, locations of businesses visited and applications used, but trade groups wanted to be able to sell this information for profit without consent.
Lobbyists connected with Republican senator Jeff Flake and Republican representative Marsha Blackburn to sponsor legislation to dismantle Internet privacy rules; Flake received $22,700 in donations and Blackburn received $20,500 in donations from these trade groups. On March 23, 2017, abolition of privacy restrictions passed on a narrow party-line vote, and the lobbying effort achieved its result.
In 2017, credit reporting agency Equifax lobbied Congress extensively, spending $1.1 million in 2016 and $500,000 in 2017, seeking rules to limit damage from lawsuits and less regulatory oversight; in August 2017, Equifax's databases were breached and the confidential data of millions of Americans was stolen by hackers and identity thieves, potentially opening up the firm to numerous class action lawsuits.
Major American corporations spent $345 million lobbying for just three pro-immigration bills between 2006 and 2008.
Internet service providers in the United States have spent more than $1.2 billion on lobbying since 1998, and 2018 was the biggest year so far with a total spend of more than $80 million.
Unions
One report suggested the United Food & Commercial Workers International Union spent $80,000 lobbying the federal government on issues relating to "the tax code, food safety, immigration reform and other issues."
Other players:
Other possible players in the lobbying arena are those who might influence legislation: House & Senate colleagues, public opinion in the district, the White House, party leaders, union leaders, and other influential persons and groups. Interest groups are often thought of as "nonparty organizations" which regularly try to change or influence government decision-making.
Lobbying methods and techniques:
Lobbying has much in common with highly people-intensive businesses such as management consulting and public relations, but with a political and legal sensibility.
Like lawmakers, many lobbyists are lawyers, and the persons they are trying to influence have the duty of writing laws. How intertwined the disciplines of law and lobbying are could be seen in the case of a Texas lawyer seeking compensation for his unfairly imprisoned client: since the exonerated client had trouble paying the legal expenses, the lawyer lobbied the Texas state legislature to raise the state's compensation for wrongly imprisoned persons from $50,000 per year to $80,000 per year. The effort succeeded, making it possible for his newly freed client to pay the lawyer's fees.
Well-connected lobbyists work in Washington for years, know the issues, are highly skilled advocates, and have cultivated close connections with members of Congress, regulators, specialists, and others. They understand strategy and have excellent communication skills; many are well suited to be able to choose which clients they would like to represent.
Lobbyists patiently cultivate networks of powerful people, over many years, trying to build trust and maintain confidence and friendships. When a client hires them to push a specific issue or agenda, they usually form coalitions to exert political pressure. Lobbying, as a result, depends on trying to be flexible to new opportunities, but at the same time, to act as an agent for a client.
As one lobbyist put it: "It's my job to advance the interests of my association or client. Period."
Access is important and often means a one-on-one meeting with a legislator. Getting access can sometimes be difficult, but there are various avenues: email, personal letters, phone calls, face-to-face meetings, meals, get-togethers, and even chasing after congresspersons in the Capitol building:
"My style of lobbying is not to have big formal meetings, but to catch members on the fly as they're walking between the House and the office buildings." — a lobbyist commenting on access.
When getting access is difficult, there are ways, as Jack Abramoff explained, to wear down the walls surrounding a legislator.
Lobbyists often assist congresspersons with campaign finance by arranging fundraisers, assembling PACs, and seeking donations from other clients. Many lobbyists become campaign treasurers and fundraisers for congresspersons. This helps incumbent members cope with the substantial amounts of time required to raise money for reelection bids; one estimate was that congresspersons had to spend a third of their working hours on fundraising activity.
PACs are fairly easy to set up; it requires a lawyer and about $300, roughly. An even steeper possible reward which can be used in exchange for favors is the lure of a high-paying job as a lobbyist; according to Jack Abramoff, one of the best ways to "get what he wanted" was to offer a high-ranking congressional aide a high-paying job after they decided to leave public office.
When such a promise of future employment was accepted, according to Abramoff, "we owned them". This helped the lobbying firm exert influence on that particular congressperson by going through the staff member or aide.
At the same time, it is hard for outside observers to argue that a particular decision, such as hiring a former staffer into a lobbying position, was purely as a reward for some past political decision, since staffers often have valuable connections and policy experience needed by lobbying firms. Research economist Mirko Draca suggested that hiring a staffer was an ideal way for a lobbying firm to try to sway their old bosses—a congressperson—in the future.
Lobbyists, according to several sources, strive for communications which are clear, straightforward, and direct. In a one-on-one meeting with a lobbyist, it helps to understand precisely what goal is wanted: a lobbyist wants action on a bill; a legislator wants to be re-elected. The idea is to persuade a legislator that what the lobbyist wants is good public policy. Lobbyists often urge lawmakers to try to persuade other lawmakers to approve a bill.
Still, persuasion is a subtle business, requiring a deft touch, and carelessness can boomerang. In one instance of a public relations reversal, a lobbying initiative by the Cassidy firm which targeted Senator Robert C. Byrd blew up when the Cassidy-Byrd connection was published in the Washington Post; this resulted in a furious Byrd reversing his previous pro-Cassidy position and throwing a "theatrical temper tantrum" regarding an $18 million facility. Byrd denounced "lobbyists who collect exorbitant fees to create projects and have them earmarked in appropriation bills... for the benefit of their clients."
Since it often takes a long time to build the network of relationships within the lobbying industry, ethical interpersonal dealings are important. A maxim in the industry is for lobbyists to be truthful with people they are trying to persuade; one lobbyist described it this way: "what you've basically got is your word and reputation".
An untruth is too risky to the successful development of a long-term relationship, and the potential gain is not worth the risk. One report suggested that below-the-belt tactics generally do not work. One account suggests that digging for "personal dirt" on opponents is counterproductive, since it would undermine respect for the lobbyist and their clients. By reverse logic, if an untruth is told by an opponent or opposing lobby, then it makes sense to publicize it.
But the general code among lobbyists is that unsubstantiated claims are bad business. Even worse is planting an informant in an opponent's camp, since if this subterfuge is ever discovered, it will boomerang negatively in a hundred ways, and credibility will drop to zero.
The importance of personal relationships in lobbying can be seen in the state of Illinois, in which father-son ties helped push a smart-grid energy bill, although there were accusations of favoritism. And there is anecdotal evidence that a business firm seeking to profitably influence legislation has to pay particular attention to which lobbyist it hires.
Strategic considerations for lobbyists, trying to influence legislation, include "locating a power base" or a constituency logically predisposed to support a given policy.
Timing, as well, is usually important, in the sense of knowing when to propose a certain action and having a big-picture view of the possible sequence of desired actions.
Strategic lobbying tries to estimate the possible responses of different groups to a possible lobby approach; one study suggested that the "expectations of opposition from other interests" was a key factor helping to determine how a lobby should operate.
Increasingly, lobbyists seek to put together coalitions and use outside lobbying by swaying public opinion. Bigger, more diverse and deep pocketed coalitions tend to be more effective in outside lobbying, and the "strength in numbers" principle often applies.
Interest groups try to build "sustainable coalitions of similarly situated individual organizations in pursuit of like-minded goals". According to one study, it is often difficult for a lobbyist to influence a staff member in Congress directly, since staffers tend to be well-informed and subject to views from competing interests.
As an indirect tactic, lobbyists can try to manipulate public opinion which, in turn, can sometimes exert pressure on congresspersons. Activities for these purposes include trying to use the mass media, cultivating contacts with reporters and editors, encouraging them to write editorials and cover stories to influence public opinion, which may have the secondary effect of influencing Congress.
According to analyst Ken Kollman, it is easier to sway public opinion than a congressional staff member since it is possible to bombard the public with "half-truths, distortion, scare tactics, and misinformation." Kollman suggests there should be two goals: (1) communicate that there is public support behind an issue to policymakers and (2) increase public support for the issue among constituents.
Kollman suggested outside lobbying was a "powerful tool" for interest group leaders. In a sense, using these criteria, one could consider James Madison as having engaged in outside lobbying, since after the Constitution was proposed, he wrote many of the 85 newspaper editorials arguing for people to support the Constitution, and these writings later became the Federalist Papers. As a result of this "lobbying" effort, the Constitution was ratified, although there were narrow margins of victory in four of the state legislatures.
Lobbying today generally requires mounting a coordinated campaign, using targeted blitzes of telephone calls, letters, and emails to congressional lawmakers, marches down the Washington Mall, bus caravans, and the like; these are often put together by lobbyists who coordinate a variety of interest group leaders to unite behind a simple, easy-to-grasp, and persuasive message.
It is important for lobbyists to follow rules governing lobbying behavior. These can be difficult and complex, take time to learn, require full disclosure, and mistakes can land a lobbyist in serious legal trouble.
Gifts for congresspersons and staffers can be problematic, since anything of sizeable value must be disclosed and generally such gifts are illegal. Failure to observe gift restrictions was one factor which caused lobbyist Jack Abramoff to eventually plead guilty to a "raft of federal corruption charges" and led to convictions for 20 lobbyists and public officials, including congressperson Bob Ney and Bush deputy interior secretary Stephen Griles.
Generally gifts to congresspersons or their staffs or federal officials are not allowed, but with a few exceptions: books are permitted, provided that the inside cover is inscribed with the congressperson's name and the name of one's organization. Gifts under $5 are allowed.
Another exception is awards, so it is permitted to give a congressperson a plaque thanking him or her for support on a given issue. Monetary gifts, payable by check, can only be made to campaign committees, not to a candidate personally or to his or her staff; giving cash or stock is not permitted.
Wealthy lobbyists often encourage other lobbying clients to donate to a particular cause, in the hope that favors will be returned at a later date. Lobbyist Gerald Cassidy encouraged other clients to give to causes dear to a particular client engaged in a current lobbying effort.
Some lobbyists give their own money: Cassidy reportedly donated a million dollars to one project, according to a report which noted that Cassidy's firm received "many times that much in fees from their clients", paid in monthly retainers. Those clients, in turn, had received "hundreds of millions in earmarked appropriations" and benefits worth "hundreds of millions more".
The dynamics of the lobbying world make it fairly easy for a semi-skilled operator to defraud a client. This is essentially what happened in the Jack Abramoff Indian lobbying scandal. There was a concerned client—in this case, an Indian casino—worried about possible ill-effects of legislation on its gambling business; and there were lobbyists such as Jack Abramoff who knew how to exploit these fears.
The lobbyists actively lobbied against their own casino client as a way to ratchet up the client's fears of adverse legislation and to stoke possible future contributions; the lobbyists committed other violations as well, such as grossly overbilling their clients and breaking rules about giving gifts to congresspersons.
Numerous persons went to jail after the scandal. Several factors can make such fraud a fairly easy activity: lobbyists are paid only to try to influence decision-makers, and may or may not succeed, making it hard to tell whether a lobbyist did actual work; much of what happens in interpersonal relations remains obscure despite rather strict disclosure and transparency requirements; and there are sizable sums of money involved. Factors such as these almost guarantee that there will be future scandals involving fraudulent lobbying activity, according to one assessment.
A fraud similar to Abramoff's was perpetrated in Maryland by lobbyist Gerard E. Evans, who was convicted of mail and wire fraud in 2000 in a case involving falsely creating a "fictitious legislative threat" against a client, and then billing the client to work against this supposed threat.
Lobbyists routinely monitor how congressional officials vote, sometimes checking the past voting records of congresspersons. One report suggested that reforms requiring "publicly recorded committee votes" led to more information about how congresspersons voted, but instead of becoming a valuable resource for the news media or voters, the information mainly helped lobbyists monitor congressional voting patterns. As a general rule, lawmakers who do not vote as a particular interest group wishes risk losing that group's support.
Strategy usually dictates targeting specific office holders. On the state level, one study suggested that much of the lobbying activity targeted the offices of governors as well as state-level executive bureaucrats; state lobbying was an "intensely personal game" with face-to-face contact being required for important decisions.
Lobbying can be a counteractive response to the lobbying efforts of others. One study suggested this was particularly true for battles surrounding possible decisions by the Supreme Court, which is considered a "battleground for public policy" in which differing groups try to "etch their policy preferences into law".
Sometimes there are lobbying efforts to slow or derail other legislative processes; for example, when the FDA began considering a cheaper generic version of the costly anti-clotting drug Lovenox, the French pharmaceutical firm Sanofi "sprang into action to try and slow the process." Lobbyists are often assembled in anticipation of a potential takeover bid, particularly when there are large high-profile companies, or a large foreign company involved, and substantial concern that the takeover may be blocked by regulatory authorities.
An example illustrates the scale involved. The company Tyco learned that a possible new tax provision under discussion might have cost it $4 billion overall. The firm hired Jack Abramoff and paid him a retainer of $100,000 a month; he assembled dozens of lobbyists with connections to key congressional committees, with the ultimate objective of influencing powerful Senator Charles Grassley.
Abramoff began with a fundraising effort to round up "every check" possible, seeking funds from his other lobbying clients.
Lobbyists as educators and advisors:
As government has grown increasingly complex, having to deal with new technologies, the task of writing rules has become more demanding. "Government has grown so complex that it is a virtual certainty that more than one agency would be affected by any piece of legislation," according to one view.
Lobbyists, therefore, spend considerable time learning the ins and outs of issues, and can use their expertise to educate lawmakers and help them cope with difficult issues. Lobbyists' knowledge has been considered an intellectual subsidy for lawmakers. Some lobbyists become specialists with expertise in a particular set of issues, although one study suggested that of the two competing criteria for lobbyists, expertise and access, access was far more important.
Lobby groups and their members sometimes also write legislation and whip bills, and it is often necessary to research relevant laws and issues beforehand. In many instances lobbyists write the actual text of a proposed law and hire lawyers skilled in legislative drafting to "get the language down pat", since an omission in wording or an unclear phrase may open up a loophole for opponents to wrangle over for years. Lobbyists can also advise a lawmaker on how to navigate the approval process.
Lobbying firms can serve as mentors and guides. For example, after months of protests by the Occupy Wall Street movement, one lobbying firm prepared a memo to its clients warning that Republicans might "turn on big banks, at least in public", which could have the effect of "altering the political ground for years to come."
Here are parts of the memo which were broadcast on the MSNBC network: "Leading Democratic party strategists have begun to openly discuss the benefits of embracing the growing and increasingly organized Occupy Wall Street (OWS) movement ... This would mean more than just short-term discomfort for Wall Street firms. If vilifying the leading companies of this sector is allowed to become an unchallenged centerpiece of a coordinated Democratic campaign, it has the potential to have very long-lasting political, policy and financial impacts on the companies in the center of the bullseye. ... the bigger concern should be that Republicans will no longer defend Wall Street companies...
— Clark, Lytle, Geduldig, Cranford, law/lobbying firm, to a Wall Street client
A Growing Billion Dollar Business:
Since the 1970s, there has been explosive growth in the lobbying industry, particularly in Washington, D.C. By 2011, one estimate put overall lobbying spending nationally at more than $30 billion. An estimate of lobbying expenses in the federal arena was $3.5 billion in 2010, up from only $1.4 billion in 1998. There is prodigious data on the subject, since firms are required to disclose lobbying expenditures on a quarterly basis.
The industry, however, is not immune to economic downturns. When Congress was gridlocked, as during the summer and early fall of 2011, lobbying activity dipped considerably, according to The Washington Post. The lobbying firm Patton Boggs reported a drop in revenue that year, from $12 million in 2010 to $11 million in 2011. To cope with the downturn, some law firms compensated by increasing activity in litigation, regulatory work, and representing clients in congressional investigations.
A sea change in government, such as a shift in control of the legislature from one political party to the other, can affect the lobbying business profoundly. For example, when the primarily Democratic-serving lobbying firm Cassidy & Associates learned in 1994 that control of Congress would pass from Democrats to Republicans, it acquired Republican lobbyists before the congressional handover of power, a move that helped the firm stay on top of the new political realities.
Examples of lobbying:
There are numerous examples of lobbying activity reported by the media. One report chronicled a somewhat unusual alliance of consumer advocates and industry groups to boost funding for the Food and Drug Administration; the general pattern of lobbying efforts had been to try to reduce the regulatory oversight of such an agency. In this case, however, lobbying groups wanted the federal watchdog agency to have tougher policing authority to avert expensive problems when oversight was lax; in this case, industry and consumer groups were in harmony, and lobbyists were able to persuade officials that higher FDA budgets were in the public interest.
Religious consortiums, according to one report, have engaged in a $400 million lobbying effort on such issues as the relation between church and state, civil rights for religious minorities, bioethics issues including abortion and capital punishment and end-of-life issues, and family issues.
Lobbying as a career:
While national-level lobbyists working in Washington have the highest salaries, many lobbyists operating at the state level can also earn substantial salaries. The table shows the top lobbyists in one state, Maryland, in 2011.
Top power-brokers such as Gerald Cassidy have made fortunes from lobbying:
Cassidy's reaction to his own wealth has been complicated. He lives large, riding around town in his chauffeured car, spending thousands on custom-made clothes, investing big money in, for example, the Charlie Palmer Steak restaurant at the foot of Capitol Hill just for the fun of it. He has fashioned a wine cellar of more than 7,000 bottles. He loves to go to England and live like a gentleman of the kind his Irish antecedents would have considered an anathema.
— journalist Robert G. Kaiser in 2007 in the Washington Post.
Effectiveness of lobbying:
The general consensus is that lobbying works in achieving sought-after results for clients, particularly since it has become so prevalent, with substantial and growing budgets, although there are dissenting views.
A study by the investment-research firm Strategas, cited in The Economist and the Washington Post, examined the 50 firms that spent the most on lobbying relative to their assets and compared their financial performance against that of the S&P 500 in the stock market; the study concluded that spending on lobbying was a "spectacular investment" yielding "blistering" returns comparable to a high-flying hedge fund, even despite the financial downturn of the preceding few years.
A 2009 study by University of Kansas professor Raquel Meyer Alexander suggested that lobbying brought a substantial return on investment.
A 2011 meta-analysis of previous research findings found a positive correlation between corporate political activity and firm performance.
There are numerous reports that the National Rifle Association (NRA) successfully influenced 45 senators to block a proposed rule regulating assault weapons, despite strong public support for gun control. The NRA spends heavily to influence gun policy: it gives $3 million annually directly to the re-election campaigns of congresspersons, and gives additional money to PACs and others to influence legislation indirectly, according to the BBC in 2016.
There is widespread agreement that a key ingredient in effective lobbying is money. This view is shared by players in the lobbying industry.
Deep pockets speak; the money trumps it all.
— Anonymous lobbyist, 2002
Still, effectiveness can vary depending on the situational context. One view is that large multiple-issue lobbies tend to be effective in getting results for their clients if they are sophisticated, managed by a legislative director familiar with the art of compromise, and play "political hardball".
But when such lobbies become too big, as with large industrial trade organizations, they become harder to control, often leading to lackluster results.
A 2001 study comparing lobbying activity in US-style congressional systems with European-style parliamentary systems found that congressional systems gave an advantage to the "agenda-setters", but that in both systems "lobbying has a marked effect on policies".
One report suggested that the 1,000 registered lobbyists in California were highly influential such that they were called the Third House.
Studies of lobbying by academics in previous decades painted a picture of lobbying being an ineffectual activity, although many of these studies were done before lobbying became prevalent in American politics.
A 1963 study by Bauer, Pool, and Dexter suggested lobbyists were mostly "impotent" in exerting influence. Studies in the early 1990s suggested that lobbying exerted influence only "marginally", although they also suggested that when lobbying activity did achieve political impacts, the results were sufficient to justify the expenditure on lobbying.
A more recent study, from 2009, found that Washington lobbies are "far less influential than political rhetoric suggests", that most lobbying campaigns do not change any views, and that there is a strong entrenchment of the status quo.
But this depends on what counts as "effective": many lobbying battles between powerful interests end in stalemate, and in many cases merely keeping the status quo can be seen as a victory of sorts. What happens often is that varying coalitions find themselves in "diametrical opposition to each other" and stalemates result.
There is anecdotal evidence from numerous newspaper accounts of different groups battling that lobbying activity usually achieves results. For example, the Obama administration pledged to stop for-profit colleges from "luring students with false promises", but faced with this threat, the lobbying industry sprang into action with a $16 million campaign, and its efforts succeeded in watering down the proposed restrictions.
And sometimes merely keeping the status quo can be seen as a victory. When gridlock led to the supposed supercommittee solution, numerous lobbyists from all parts of the political spectrum worked hard, and a stalemate resulted, with each side defending its own special interests. And while money is an important variable, it is one among many, and there have been instances in which huge sums were spent on lobbying only to have the result backfire.
One report suggested that the communications firm AT&T failed to achieve substantial results from its lobbying efforts in 2011, since government antitrust officials rejected its plan to acquire rival T-Mobile.
Lobbying is a practical necessity for firms that "live and die" by government decisions, such as large government contractors like Boeing.
A 2006 study by Bloomberg News suggested that lobbying was a "sound money-making strategy" for the 20 largest federal contractors. The largest contractor, Lockheed Martin Corporation, received almost $40 billion in federal contracts in 2003-04, while spending $16 million on lobbying expenses and campaign donations. For each dollar of lobbying investment, the firm received $2,517 in revenues, according to the report.
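The reported figure is consistent with simply dividing contract revenue by lobbying-related outlays. A rough arithmetic check, using the rounded numbers quoted above (the report's exact inputs evidently differed slightly, since it cites $2,517):

```python
# Rough check of the Bloomberg-reported return ratio for Lockheed Martin,
# using the rounded figures quoted above rather than the report's exact data.
federal_contracts = 40e9        # ~$40 billion in federal contracts, 2003-04
lobbying_and_donations = 16e6   # ~$16 million in lobbying and campaign donations

revenue_per_lobbying_dollar = federal_contracts / lobbying_and_donations
print(f"${revenue_per_lobbying_dollar:,.0f} per dollar")  # ~$2,500, close to the reported $2,517
```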
When the lobbying firm Cassidy & Associates began achieving results with earmarks for colleges, universities, and medical centers, new lobbying firms rose to compete with it to win "earmarks of their own", a clear sign that the lobbying was exceedingly effective.
Lobbying controversies:
Lobbying has been the subject of much debate and discussion. There is widespread concern that lobbying has been a significant corrupting influence in American politics, although criticism is not universal, and there have been arguments put forward to suggest that the system is working properly.
Unfavorable image:
Generally the image of lobbyists and lobbying in the public sphere is not a positive one, although this is not a universal sentiment. Lobbyists have been described as "hired guns" without principles or positions.
Scandals involving lobbying have helped taint the image of the profession, such as those involving lobbyist Jack Abramoff and congressmen Randy "Duke" Cunningham and Bob Ney, in which words such as "bribery", "lobbyist", "member of Congress" and "prison" tended to appear together in the same articles.
Negative publicity can sully lobbying's image to a great extent.
There are a variety of reasons why lobbying has acquired a negative image in the public consciousness. While there is much disclosure, a great deal of lobbying happens in hard-to-disclose personal meetings, and the resulting secrecy and confidentiality can serve to lower lobbying's status.
Revolving door:
Main article: Revolving door (politics)
Since the 1980s, congresspersons and staffers have been "going downtown"—becoming lobbyists—and the big draw is money. The "lucrative world of K Street" means that former congresspersons with even "modest seniority" can move into jobs paying $1 million or more annually, without including bonuses for bringing in new clients.
The general concern of this revolving-door activity is that elected officials—persons who were supposed to represent the interests of citizens—have instead become entangled with the big-money interests of for-profit corporations and interest groups with narrow concerns, and that public officials have been taken over by private interests.
In July 2005, Public Citizen published a report entitled "The Journey from Congress to K Street": the report analyzed hundreds of lobbyist registration documents filed in compliance with the Lobbying Disclosure Act and the Foreign Agents Registration Act among other sources. It found that since 1998, 43 percent of the 198 members of Congress who left government to join private life have registered to lobby.
A similar report from the Center for Responsive Politics found 370 former members were in the "influence-peddling business", with 285 officially registered as federal lobbyists, and 85 others who were described as providing "strategic advice" or "public relations" to corporate clients.
The Washington Post described these results as reflecting the "sea change that has occurred in lawmakers' attitudes toward lobbying in recent years." The report included a case study of one particularly successful lobbyist, Bob Livingston, who stepped down as Speaker-elect and resigned his seat in 1999.
In the six years since his resignation, The Livingston Group grew into the 12th largest non-law lobbying firm, earning nearly $40 million by the end of 2004. During roughly the same time period, Livingston, his wife, and his two political action committees (PACs) contributed over $500,000 to the campaign funds of various candidates.
Numerous reports chronicle the revolving door phenomenon. A 2011 estimate suggested that nearly 5,400 former congressional staffers had become federal lobbyists over a ten-year period, and 400 lawmakers made a similar jump. It is a "symbiotic relationship" in the sense that lobbying firms can exploit the "experience and connections gleaned from working inside the legislative process", and lawmakers find a "ready pool of experienced talent."
There is movement in the other direction as well: one report found that 605 former lobbyists had taken jobs working for lawmakers over a ten-year period. A study by the London School of Economics found 1,113 lobbyists who had formerly worked in lawmakers' offices.
The lobbying option is a way for staffers and lawmakers to "cash in on their experience", according to one view. Before the 1980s, staffers and aides worked many years for congresspersons, sometimes decades, and tended to stay in their jobs; now, with the lure of higher-paying lobbying jobs, many would quit their posts after a few years at most to "go downtown."
And it is not just staffers, but lawmakers as well, including high-profile ones such as congressperson Richard Gephardt. He represented a "working-class" district in Missouri for many years, but after leaving Congress he became a lobbyist. In 2007 he began his own lobbying firm, the "Gephardt Government Affairs Group", and by 2010 it was earning close to $7 million in revenues, with clients including Goldman Sachs, Boeing, Visa Inc., Ameren Corporation, and Waste Management Inc.
Senators Robert Bennett and Byron Dorgan became lobbyists too. Mississippi governor Haley Barbour became a lobbyist. In 2010, former representative Billy Tauzin earned $11 million running the drug industry's lobbying organization, called Pharmaceutical Research and Manufacturers of America (PhRMA).
Tauzin's bill to provide prescription drug access to Medicare recipients gave major concessions to the pharmaceutical industry: (1) Medicare was prevented from negotiating lower costs for prescription drugs (2) the reimportation of drugs from first world countries was not allowed (3) Medicare D was undermined by a policy of Medigap D.
After the bill passed a few months later, Tauzin retired from Congress and took an executive position at PhRMA to earn an annual salary of $2 million. Many former representatives earned over $1 million in one year, including James Greenwood and Daniel Glickman.
Insider's Game:
A similar concern voiced by critics of lobbying is that Washington politics has become dominated by elites, and that it is an "insider's game" excluding regular citizens and which favors entrenched firms.
Individuals generally cannot afford to lobby, and critics question whether corporations with "deeper pockets" should have greater power than regular persons. In this view, the system favors the rich, such that "the rich have gotten richer, the weak weaker", as lobbyist Gerald Cassidy admits.
There is concern that those having more money and better political connections can exert more influence than others. Analyst Barry Hessenius, however, has argued that excessive for-profit lobbying could be counteracted by efforts to increase nonprofit lobbying and boost its effectiveness. There is so much money involved that it has been described as a "flood" with a "corrupting influence", such that the United States appears to be "awash" in interest groups.
If coalitions of different forces battle in the political arena for favorable treatment and better rules and tax breaks, it can be seen as fair if both sides have equal resources and try to fight for their interests as best they can.
"In a lot of areas, the stakes are between big companies, and it's hard to argue that one solution is better than another solution with regard to the consumer's interest ... The issue ... is whether Company A's solution, or Company B's solution, based on their technology or their footprint, is the right one."
— Lobbyist Gerald Cassidy
A related but slightly different criticism is that the problem with lobbying as it exists today is that it creates an "inequity of access to the decision-making process". As a result, important needs get left out of the political evaluation, such that there are no anti-hunger lobbies or lobbies seeking serious solutions to the problem of poverty.
Nonprofit advocacy has been "conspicuously absent" from lobbying efforts, according to one view. Critics suggest that when a powerful coalition battles a less powerful one, or one which is poorly connected or underfunded, the result may be seen as unfair and potentially harmful for the entire society.
The increasing number of former lawmakers becoming lobbyists has led Senator Russ Feingold (D-WI) to propose paring back the many Capitol Hill privileges enjoyed by former senators and representatives. His plan would deprive lawmakers-turned-lobbyists of privileges such as unfettered access to otherwise "members only" areas such as the House and Senate floors and the House gym.
Choice-making problems:
A concern among many critics is that influence peddling hurts overall decision making: proposals with merit are dropped in favor of proposals backed by political expediency.
An example cited in the media is the battle between food-industry lobbyists and healthcare lobbyists over school lunches. A group supported by the United States Department of Agriculture proposed healthier lunches as a way to combat childhood obesity, by limiting servings of potatoes, limiting salty foods, and adding more fresh vegetables, but this group was countered by a strong food lobby backed by Coca-Cola, Del Monte, and makers of frozen pizza.
The food lobbyists succeeded in blocking the proposed reforms, even securing rules suggesting that the tomato paste on a pizza qualifies as a vegetable; overall, according to critics, the case appeared to be an example of business interests winning out over health concerns.
Critics use examples such as these to suggest that lobbying distorts sound governance. A study by IMF economists found that the "heaviest lobbying came from lenders making riskier loans and expanding their mortgage business most rapidly during the housing boom," and that there were indications that heavy-lobbying lenders were more likely to receive bailout funds. The study found a correlation between lobbying by financial institutions and excessive risk-taking during 2000–2007, and the authors concluded that "politically active lenders played a role in accumulation of risks and thus contributed to the financial crisis".
Another study suggested that governments tend to protect domestic industries, and have a habit of shunting monies to ailing sectors; the study suggested that "it is not that government policy picks losers, it is that losers pick government policy." One critic suggested that the financial industry has successfully blocked attempts at regulation in the aftermath of the 2008 financial collapse.
Governmental focus:
Critics have contended that when lawmakers are drawn into battles over issues such as the composition of school lunches or how much an ATM fee should be, more serious issues such as deficit reduction, global warming, or social security are neglected, leading to legislative inertia.
The concern is that the preoccupation with what are seen as superficial issues prevents attention to long-term problems. Critics suggested that the 2011 Congress spent more time discussing per-transaction debit-card fees while neglecting issues seen as more pressing.
Methodological problems:
In this line of reasoning, critics contend that lobbying, in and of itself, is not the sole problem, but only one aspect of a larger problem with American governance. Critics point to an interplay of factors: citizens uninvolved politically; congresspersons needing huge sums of money for expensive television advertising campaigns; the increasing complexity of technologies; congresspersons spending three days of every week raising money; and so forth.
Given these pressures, lobbying arose as a logical response to the needs of congresspersons seeking campaign funds and staffers seeking personal enrichment. In a sense, in competitive politics, the common good gets lost:
I know what my client wants; no one knows what the common good is.
— Anonymous lobbyist
A lobbyist can identify a client's needs, but it is hard for any single individual to say what is best for the whole group. The Framers of the Constitution intended built-in constitutional safeguards to protect the common good, but according to these critics, those protections do not seem to be working well:
Lawrence Lessig, a professor at Harvard Law School and author of Republic, Lost, suggested that the moneyed persuasive power of special interests has insinuated itself between the people and the lawmakers.
He quoted congressperson Jim Cooper who remarked that Congress had become a "Farm League for K Street" in the sense that congresspersons were focused on lucrative lobbying careers after Congress rather than on serving the public interest while in office. In a speech, Lessig suggested the structure of incentives was such that legislators were tempted to propose unnecessary regulations as a way to further lobbying industry activity.
According to one view, major legislation such as the proposed Wall Street reforms has spurred demand for "participating in the regulatory process." Lessig suggested the possibility that it was not corporations deciding to take up lobbying, but Congress choosing to debate less-than-important issues to draw well-heeled corporations into the political fray as lobbyists.
As a result of his concerns, Lessig has called on state governments to summon a Second Constitutional Convention to propose substantive reform. Lessig believes that a constitutional amendment should be written to limit political contributions from non-citizens, including corporations, anonymous organizations, and foreign nationals.
Our current tax system with all its complexities is in part designed to make it easier for candidates, in particular congressmen, to raise money to get back to congress ... All sorts of special exceptions which expire after a limited period of time are just a reason to pick up the phone and call somebody and say 'Your exception is about to expire, here’s a good reason for you to help us fight to get it to extend.' And that gives them the opportunity to practice what is really a type of extortion – shaking the trees of money in the private sector into their campaign coffers so that they can run for congress again.
— Lawrence Lessig, 2011
Scholars such as Richard Labunski, Sanford Levinson, Glenn Reynolds, Larry Sabato as well as newspaper columnist William Safire, and activists such as John Booth of RestoringFreedom.org have called for constitutional changes that would curb the powerful role of money in politics.
Expansion of lobbying:
Law in the United States is generally made by Congress, but as the federal government expanded over much of the twentieth century, a sizeable number of federal agencies emerged, generally under the control of the president. These agencies often write industry-specific rules and regulations regarding such things as automobile safety and air quality.
Unlike elected congresspersons, who are constantly seeking campaign funds, these appointed officials are generally harder to influence. However, there are indications that lobbyists seek to expand their influence from the halls of Congress deeper into the federal bureaucracy.
President Obama pledged during the election campaign to rein in lobbying. As president in January 2009, he signed two executive orders and three presidential memoranda to help ensure his administration would be more open, transparent, and accountable. These documents attempted to bring increased accountability to federal spending and limit the influence of special interests, and included a lobbyist gift ban and a revolving door ban.
In May 2009, these were followed by the Recovery Act Lobbying Rules. The Executive Branch Reform Act, H.R. 985, was a bill which would have required over 8,000 Executive Branch officials to report into a public database nearly any "significant contact" from any "private party." The purpose was to identify lobbying activity. The bill was supported by proponents, including groups such as Public Citizen, as an expansion of "government in the sunshine".
But the proposals ran into serious opposition from various groups including the lobbying industry itself. Opponents argued that the proposed reporting rules would have infringed on the right to petition, making it difficult not just for lobbyists, but for regular citizens to communicate their views on controversial issues without having their names and viewpoints entered into a government database.
Opposition groups suggested that although the proposed rules were promoted as a way to regulate "lobbyists," persons described as a "private party" could be practically anybody, and that anybody contacting a federal official might be deemed to be a "lobbyist".
The U.S. Department of Justice raised constitutional and other objections to the bill. Opponents mobilized over 450 groups including the U.S. Chamber of Commerce and National Association of Realtors with letter writing campaigns against the proposed restrictions.
Lobbyist Howard Marlowe argued in a "stern letter" that the restriction on gift-giving to federal employees would create "fear of retribution for political donations".
In 2011, there were efforts to "shift regulatory power from the executive branch to Congress" by requiring that any "major rule" which may cost the economy more than $100 million must be decided by Congress with an up-or-down vote. But skeptics think that such a move proposed by Republican lawmakers could "usher in a lobbying bonanza from industry and other special-interest groups" to use campaign contributions to reshape the regulatory milieu.
Potential for reform:
Critics suggest that Congress has the power to fix itself, but is reluctant to sacrifice money and power. One report suggested that those in control had an "unbroken record of finding ways to navigate around reform laws or turn regulatory standards to their own advantage."
Arguments for lobbying:
There are counterarguments that the system is working as it should, despite being rather messy. According to this line of argument, the Madisonian view of politics—in which factions were supposed to compete with other factions—is working exactly as it should.
Competing factions, or in this case, competing interest groups, square off. Battles happen within the federal government, but instead of being settled by elections, arguments are settled by powerful interest groups fighting each other, often financially. And it might appear to members of a group which lost a lobbying battle that the reason for their loss was that the other side lobbied unfairly using more money.
There are numerous instances in which opposed lobbies stalemate, and instances in which these stalemates have been seen as a positive result. And sometimes powerful financial interests lose the battle.
Lobbying brings valuable information to policymakers, according to another argument in favor of lobbying. Since lobbyists often become highly knowledgeable about a specific issue by studying it in depth over years, they can bring considerable expertise to help legislators avoid errors as well as grasp the nuances of complex issues.
This information can also help Congress oversee numerous federal agencies which often regulate complex industries and issue highly detailed and specific rulings. Accordingly, it is difficult for Congress to keep track of what these agencies do. It has been argued that lobbyists can help Congress monitor this activity by possibly raising "red flags" about proposed administrative rulings.
Further, congresspersons can quickly gauge where they stand about a proposed administrative ruling simply by seeing which lobbying groups support the proposal, and which oppose it.
Another argument in support of lobbying is that different interest groups and lobbyists, while trying to build coalitions and win support, often amend, soften, or change their positions in the process, and that interest groups and lobbyists thus regulate each other, in a sense.
But a more general sentiment supporting the lobbying arrangement is that every citizen can be construed as being "represented" by dozens of special interests:
Every citizen is a special interest... Blacks, consumers, teachers, pro-choicers, gun control advocates, handicapped people, aliens, exporters, and salesmen – are all special interests...
There is not an American today who is not represented (whether he or she knows it or not) by at least a dozen special interest groups. ... One person's special interest is another person's despotism...— Donald E. deKieffer, author of The Citizen's Guide to Lobbying Congress, 2007
If powerful groups such as the oil industry succeed in winning a battle in government, consumers who drive gas-powered cars can benefit a bit, according to this view. Even readers of Wikipedia could be conceived as being a special interest and represented by various lobbies.
For example, opponents of the Stop Online Piracy Act believed that the act might restrict sites such as Wikipedia; on January 18, 2012, as a form of protest and as a way to encourage readers and contributors of English Wikipedia to write their congresspersons, the online encyclopedia was "blacked out for a day as part of an effort to lobby the government."
Another view in support of lobbying is that it serves a useful purpose by helping guard against extremism. According to this view, lobbying adds "built-in delays" and encourages opposing lobbies to battle each other; in the battling, potentially damaging decrees and incorrect decisions are stymied by seemingly unhelpful delays and waits.
A slightly different view is that lobbying is no different from other professions: "Lobbying is no more perfect than is the practice of law or the practice of medicine."
— Lobbyist Gerald S. J. Cassidy, 2007
The regulatory environment:
Disclosure and domestic regulations:
Generally, the United States requires systematic disclosure of lobbying, and it may be one of the few countries to have such extensive requirements. Disclosure, in one sense, allows lobbyists and public officials to justify their actions under the banner of openness and in full compliance with the law.
The rules often specify how much a lobbyist can spend on specific activities, and how to report expenses; many of the laws and guidelines are specified in the Lobbying Disclosure Act of 1995.
Transparency and disclosure requirements mean that there are volumes of statistics available for all kinds of analyses—by journalists, by the public, by rival lobbying efforts. Researchers can subdivide lobbying expenditures by numerous breakdowns, such as by contributions from energy companies.
Sometimes it can be difficult to define clearly who is a "lobbyist" and what precisely lobbying activities are; the Lobbying Disclosure Act provides several statutory definitions of these terms.
Still, distinguishing lobbyists from a strategic adviser can be difficult, since the duties of each can often overlap and are hard to define precisely.
There have also been questions about what distinguishes a lobbyist from a bundler. One report described bundlers as "supporters who contribute their own money to a campaign and solicit it from others," and asked whether persons raising campaign money in this way for the election of Barack Obama were really lobbyists, and whether Obama had broken his own pledge not to receive money from lobbyists.
The legal ramifications of lobbying are further entangled with aspects of campaign finance reform, since lobbyists often spend time seeking donations for the reelection efforts of congresspersons; sorting out these issues can pose ethical challenges.
There are numerous regulations governing the practice of lobbying, often ones requiring transparency and disclosure. People paid to lobby must register with the secretary of the Senate and the clerk of the House of Representatives within 45 days of contacting a legislator for the first time, or 45 days after being employed.
An exception is that lobbyists who earn less than $3,000 per client for each fiscal quarter, or whose total lobbying expenses are less than $11,500 each quarter, do not need to register.
Part-time lobbyists are exempt from registering unless they spend more than 20% of their working hours doing lobbying activities in any quarter. If lobbyists have two or more contacts with a legislator as a lobbyist, then they must register.
Requirements for registering also apply to companies that specialize in lobbying, or ones that have an in-house lobbyist, particularly if they spend more than $11,500 on lobbying.
Generally, nonprofit organizations, other than churches, are exempt from registering if they hire an outside lobbying firm. Filings must be made each quarter, with a separate filing for each of the lobbyist's clients, and must include information such as the name and title of the client, an estimate of lobbying expenses, and an estimate of the income the lobbyist received for the lobbying.
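As a rough illustration only (not legal advice), the registration thresholds described above can be combined into a simple check. The function name, parameter names, and the exact way the tests are combined here are assumptions made for clarity; the dollar figures are those quoted in this section, and the actual statutory tests under the Lobbying Disclosure Act are more involved.

```python
def must_register(income_per_client: float,
                  quarterly_expenses: float,
                  share_time_lobbying: float,
                  lobbying_contacts: int) -> bool:
    """Rough per-quarter sketch of the registration rules described above."""
    # Exempt if paid under $3,000 by the client this quarter, or if
    # total lobbying expenses stay under $11,500 for the quarter.
    if income_per_client < 3_000 or quarterly_expenses < 11_500:
        return False
    # Part-timers are exempt unless lobbying exceeds 20% of working hours.
    if share_time_lobbying <= 0.20:
        return False
    # Two or more lobbying contacts with a legislator triggers registration.
    return lobbying_contacts >= 2
```

Under this sketch, for example, a lobbyist earning $5,000 from a client, spending $20,000 in the quarter, lobbying half-time, and making three contacts would need to register.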
States, in addition, are moving in the direction of greater disclosure and transparency regarding lobbying activities. California has an online database called Cal-Access, although there have been reports that it is underfunded. Money collected from registration fees is often used to pay for disclosure services such as Cal-Access.
There were complaints in Illinois that the disclosure requirements were often not rigorous enough and allowed lobbyists to work "without public notice" and with possible "conflicts of interest".
Many municipalities also require legislative agents to register as lobbyists before representing clients' interests to city council members; examples include Columbus and Cincinnati in the swing state of Ohio.
Laws requiring disclosure became more prevalent in the twentieth century. In 1946, a so-called "sunshine law" required lobbyists to disclose what they were doing, on whose behalf, and how much they received in payment. The resulting Federal Regulation of Lobbying Act of 1946 governed lobbying rules until 1995, when the Lobbying Disclosure Act replaced it.
The Federal Election Campaign Act of 1971, later amended in 2002 by the McCain–Feingold Act (the Bipartisan Campaign Reform Act), set rules governing campaign contributions. Each chamber of Congress has rules as well. Legislation generally requires reports containing an accounting of major expenditures as well as the legislation that was influenced; the wording of some of the pertinent laws can be found in 2 U.S.C. ch. 26.
Lobbying law is a constantly evolving field; the American Bar Association published a book of guidelines in 2009 with over 800 pages. The laws are often rather specific, and when not observed, can lead to serious trouble.
Failing to file a quarterly report, or knowingly filing an incorrect report, or failing to correct an incorrect report, can lead to fines up to $200,000 and imprisonment up to five years.
Penalties can apply to lobbyists who fail to list gifts made to a legislator. In other situations, the punishment can be light: for example, Congressional aide-turned-lobbyist Fraser Verrusio spent a few hours in jail after pleading guilty to taking a client to a World Series baseball game and failing to report it.
Tax rules can apply to lobbying. In one situation, the charity Hawaii Family Forum risked losing its tax-exempt status after it had engaged in lobbying activity; federal tax law requires charities such as that one to limit their lobbying to 20% of their overall expenditures or else risk being taxed as a for-profit corporation.
Lobbyists sometimes support rules requiring greater transparency and disclosure:
"Our profession is at a critical point where we can either embrace the constructive changes and reforms by Congress or we can seek out loopholes and continue the slippery slide into history alongside the ranks of snake oil salesmen" — Lobbyist Gerald S. J. Cassidy, 2007
Scandals can spur impetus toward greater regulation as well. The Jack Abramoff Indian lobbying scandal, which started in the 1990s and led to a guilty plea in 2006, inspired the Legislative Transparency and Accountability Act of 2006 (S. 2349), a Senate bill whose provisions were detailed by Time Magazine.
In 1995, the 104th Congress tried to reform lobbying by passing the Lobbying Disclosure Act of 1995, which defines lobbyists and requires those who are compensated for their actions to register with congressional officials. The legislation was later amended by the Lobbying Disclosure Technical Amendments Act of 1998.
There were subsequent modifications leading to the Honest Leadership and Open Government Act of 2007. The Lobbying Transparency and Accountability Act of 2006 (H.R. 4975) legislation modified Senate rules, although some senators and a coalition of good-government groups assailed the bill as being too weak.
The Honest Leadership and Open Government Act of 2007 was a comprehensive ethics and lobbying reform bill (H.R. 2316), which passed the House in 2007 by a large majority. A parallel Senate version of the legislation (S. 1) passed in 2007 by a nearly unanimous vote. After the House and Senate resolved their differences and passed an amended revision, President Bush signed the enrolled bill into law (Pub.L. 110–81).
Some states have considered permanently banning government employees from lobbying on issues they had worked on. For example, there was a proposal along these lines to prevent county employees in Maryland from ever lobbying on issues they had worked on; the proposal would also have required county officials to post financial disclosures and would have prohibited gifts from contractors.
Jack Abramoff, emerging from prison, has spoken publicly about lobbying. In his view, regulations designed to rein in the excesses of lobbying have not been effective, and reforms and regulations have not cleaned up the system "at all".
Abramoff said lobbyists could "find a way around just about any reform Congress enacted", and gave an example: You can't take a congressman to lunch for $25 and buy him a hamburger or a steak or something like that ... But you can take him to a fund-raising lunch and not only buy him that steak, but give him $25,000 extra and call it a fund-raiser – and have all the same access and all the same interactions with that congressman.
— Jack Abramoff, commenting on 60 Minutes, according to CNN
A similar view suggested that lobbying reform has been "fought tooth and nail to prevent its passage", since the people with the power to reform would be curtailing their own power and income flows.
Foreign lobbying:
Since commerce worldwide is becoming more integrated, with firms headquartered in one country increasingly doing business in many other countries, it is logical to expect that lobbying efforts will reflect the increasing globalization. Sometimes foreign-owned corporations will want to lobby the United States government, and in such instances, new rules can apply, since it can be particularly thorny resolving whether national security interests are at stake and how they might be affected.
There were serious concerns about lobbying firms representing foreign entities – and potentially values opposed to American principles – after Axis agitprop was planted on American soil in the run-up to World War II, including through the efforts of public-relations specialist Ivy Lee on behalf of the German Dye Trust. As a result, in 1938, Congress passed the Foreign Agents Registration Act, or FARA, which required foreign lobbyists to share information about their contracts with the Justice Department and to provide an explicit listing of all political activities undertaken on behalf of any foreign principal. FARA's mandate was to disclose to policymakers the sources of information that influenced public opinions, policies, and law.
However, the goal was not to restrict the speech of the lobbyist or the content of the lobbying. Nonetheless, it was estimated that less than half of foreign lobbyists who should have registered under FARA actually did so.
By the 1960s, perceived failures in FARA's enforcement led to public outcry against lobbying excesses, while revelations of foreign bribery circulated regularly well into the early 1970s. This prompted proposed legislation to reduce the autonomy of foreign firms, most of which failed to pass over concerns about its constitutionality.
While the House of Representatives passed a rule to increase public scrutiny of foreign lobbying, one estimate was that about 75% of lobbyists were exempt from a registration requirement, including individuals representing foreign interests.
A general trend is that the number of lobbyists representing foreign companies is rising. One case that attracted negative publicity was that of Washington's APCO Worldwide, a firm which in 1995 represented the dictatorship of Nigeria's General Sani Abacha, whose regime had hanged nine pro-democracy activists.
While current law forbids foreign nations from contributing to federal, state, or local elections, loopholes allow American subsidiaries of foreign corporations to establish so-called separate segregated funds, or SSFs, to raise money. According to one view, the definition of which firms count as "foreign" is unclear, and the lack of clarity undermines the ability to regulate their activity.
Foreign-funded lobbying efforts include the Israel, Saudi Arabia, Turkey, Egypt, Pakistan, Libya, and China lobbies. In 2010, foreign governments spent approximately $460 million on lobbying Congress and the U.S. government. Between 2015 and 2017, Saudi Arabia paid $18 million to 145 registered lobbyists to influence the U.S. government.
While Congress has tried to quell criticism of the leverage of domestic lobbying firms by updating domestic lobbying legislation – such as the revision of the Lobbying Disclosure Act in 1997 – there was a report that its inaction in rectifying loopholes in foreign lobbying regulation has led to scandals. There was also a report of an upsurge of lobbying by foreign-owned U.S. subsidiaries against Democratic efforts to limit campaign spending in early 2010.
The proposal was to restrict lobbying by U.S. subsidiaries of foreign firms. In 2011, the Chinese firm Alibaba hired a lobbying firm in Washington when it began contemplating a purchase of the U.S. firm Yahoo!. There was also a case in which a lobbying effort described as "extraordinary" sought to change the designation of a fringe Iranian opposition group from terrorist organization to benign organization.
Lobbyists seeking to downgrade the designation hired influential foreign affairs officials, including former CIA directors, a former FBI director, and others, to advocate for the change. But others have been accused of illegally lobbying for foreign nations, or of failing to register as a foreign agent, and may face prison time as a result.
For more about Political Lobbying in the United States, click on any of the following blue hyperlinks:
Citizens United v. Federal Election Commission, 558 U.S. 310 (2010), is a landmark United States Supreme Court case concerning campaign finance.
The Court held that the free speech clause of the First Amendment prohibits the government from restricting independent expenditures for political communications by corporations, including nonprofit corporations, labor unions, and other associations.
The case arose after Citizens United, a conservative non-profit organization, sought to air and advertise a film critical of Democratic presidential candidate Hillary Clinton shortly before the 2008 Democratic primary elections.
Airing the film would have violated the 2002 Bipartisan Campaign Reform Act, which prohibited any corporation or labor union from making an "electioneering communication" within 30 days of a primary or 60 days of an election, or from making any expenditure advocating the election or defeat of a candidate at any time.
In a majority opinion joined by four other justices, Associate Justice Anthony Kennedy held that the Bipartisan Campaign Reform Act's prohibition of all independent expenditures by corporations and unions violated the First Amendment's protection of free speech.
The Court overruled Austin v. Michigan Chamber of Commerce (1990), which had allowed different restrictions on speech-related spending based on corporate identity, as well as a portion of McConnell v. FEC (2003) that had restricted corporate spending on electioneering communications.
The ruling effectively freed labor unions and corporations to spend money on electioneering communications and to directly advocate for the election or defeat of candidates. In his dissenting opinion, Associate Justice John Paul Stevens argued that the Court's ruling represented "a rejection of the common sense of the American people, who have recognized a need to prevent corporations from undermining self government."
The decision remains highly controversial, generating much public discussion and receiving strong support and opposition from various groups. Senator Mitch McConnell commended the decision, arguing that it represented "an important step in the direction of restoring the First Amendment rights".
By contrast, President Barack Obama stated that the decision "gives the special interests and their lobbyists even more power in Washington". The ruling had a major impact on campaign finance, allowing unlimited election spending by corporations and labor unions and fueling the rise of Super PACs. Later rulings by the Roberts Court, including McCutcheon v. FEC (2014), would strike down other campaign finance restrictions.
Click on any of the following blue hyperlinks for more about Citizens United v. Federal Election Commission Supreme Court Ruling:
End Citizens United (ECU) is a political action committee in the United States.
The organization is working to reverse the U.S. Supreme Court's 2010 decision in Citizens United v. Federal Election Commission (see above), which deregulated limits on independent expenditure group spending for (or against) specific candidates.
ECU is focused on driving large campaign donations out of politics, with a goal of electing "campaign-finance reform champions" to Congress by contributing and raising money for these candidates as well as running independent expenditures. End Citizens United was founded in 2015 and operated in its first election cycle during 2016 with more than $25 million in funding.
The organization has endorsed Democratic candidates such as Zephyr Teachout, Hillary Clinton, Russ Feingold, Beto O'Rourke, Elizabeth Warren and Jon Ossoff.
ECU was one of the largest outside groups funding the campaigns of U.S. Senators Maggie Hassan and Catherine Cortez Masto during the 2016 election, spending a combined $4.4 million on the races. End Citizens United announced that it had raised more than $7.5 million from grassroots donations by mid-2017, and planned to raise $35 million for the 2018 election cycle.
In the spring of 2018, an anonymous U.S.-based contractor paid at least 3,800 micro-job workers to manipulate which stories would come up when people searched for the PAC via Google.
Click on any of the following blue hyperlinks for more about "End Citizens United":
- Lobbying in the United States
- Citizens United v. FEC Supreme Court Decision
- End Citizens United
Lobbying in the United States describes paid activity in which special interests hire well-connected professional advocates, often lawyers, to argue for specific legislation in decision-making bodies such as the United States Congress. It is a highly controversial phenomenon, often seen in a negative light by journalists and the American public, with some critics describing it as a legal form of bribery or extortion.
While lobbying is subject to extensive and often complex rules which, if not followed, can lead to penalties including jail, the activity of lobbying has been interpreted by court rulings as constitutionally protected free speech and a way to petition the government for the redress of grievances, two of the freedoms protected by the First Amendment of the Constitution.
Since the 1970s, lobbying activity has grown immensely in the United States in terms of the numbers of lobbyists and the size of lobbying budgets, and has become the focus of much criticism of American governance.
Since lobbying rules require extensive disclosure, there is a large amount of information in the public sphere about which entities lobby, how, at whom, and for how much. The current pattern suggests much lobbying is done primarily by corporations, although a wide variety of coalitions representing diverse groups also lobby. Lobbying takes place at every level of government, including federal, state, county, municipal, and even local governments.
In Washington, D.C., lobbying usually targets members of Congress, although there have been efforts to influence executive agency officials as well as Supreme Court appointments.
Lobbying can have an important influence on the political system; for example, a study in 2014 suggested that special interest lobbying enhanced the power of elite groups and was a factor shifting the nation's political structure toward an oligarchy in which average citizens have "little or no independent influence".
The number of lobbyists in Washington is estimated to be over twelve thousand, but most lobbying (in terms of expenditures), is handled by fewer than 300 firms with low turnover.
A report in The Nation in 2014 suggested that while the number of registered lobbyists in 2013 (12,281) decreased compared to 2002, lobbying activity was increasing and "going underground" as lobbyists use "increasingly sophisticated strategies" to obscure their activity. Analyst James A. Thurber estimated that the actual number of working lobbyists was close to 100,000 and that the industry brings in $9 billion annually.
Lobbying has been the subject of academic inquiry in various fields, including law, public policy, economics and even marketing strategy.
Overview:
Political scientist Thomas R. Dye once said that politics is about battling over scarce governmental resources: who gets them, where, when, why and how.
Since government makes the rules in a complex economy such as the United States, it is logical that various organizations, businesses, individuals, nonprofits, trade groups, religions, charities and others—which are affected by these rules—will exert as much influence as they can to have rulings favorable to their cause.
And the battling for influence has happened in every organized society since the beginning of civilization, whether it was Ancient Athens, Florence during the time of the Medici, Late Imperial China, or the present-day United States. Modern-day lobbyists in one sense are like the courtiers of the Ancien Régime. If voting is a general way for a public to control a government, lobbying is a more specific, targeted effort, focused on a narrower set of issues.
The term lobby has etymological roots in the physical structure of the British Parliament, in which there was an intermediary covered room outside the main hall. People pushing an agenda would try to meet with members of Parliament in this room, and they came to be known, by metonymy, as lobbyists, although one account in 1890 suggested that the application of the word "lobby" is American and that the term is not used as much in Britain.
The Willard Hotel, two blocks from the White House at 1401 Pennsylvania Avenue, claims the term originated there: "It was in the Willard lobby that Ulysses S. Grant popularized the term 'lobbyist.' Often bothered by self-promoters as he sat in the lobby and enjoyed his cigar and brandy, he referred to these individuals as 'lobbyists.'"
The term lobbying in everyday parlance can describe a wide variety of activities, and in its general sense, suggests advocacy, advertising, or promoting a cause. In this sense, anybody who tries to influence any political position can be thought of as "lobbying", and sometimes the term is used in this loose sense. A person who writes a letter to a congressperson, or even questions a candidate at a political meeting, could be construed as being a lobbyist.
However, the term "lobbying" generally means a paid activity whose purpose is to "influence or sway" a public official – including bureaucrats and elected officials – toward a desired action, often relating to specific legislation. Advocacy is disseminating information in an attempt to persuade public officials, the public, and the media to support a cause; when that activity becomes focused on specific legislation, either in support or in opposition, it crosses the line from advocacy into lobbying. This is the usual sense of the term "lobbying." One account suggested that much of the activity of nonprofits was not lobbying per se, since it usually did not aim at changes in legislation.
A lobbyist, according to the legal sense of the word, is a professional, often a lawyer.
Lobbyists are intermediaries between client organizations and lawmakers: they explain to legislators what their organizations want, and they explain to their clients what obstacles elected officials face. One definition of a lobbyist is someone "employed to persuade legislators to pass legislation that will help the lobbyist's employer." Many lobbyists work in lobbying firms or law firms, some of which retain clients outside lobbying. Others work for advocacy groups, trade associations, companies, and state and local governments.
A lobbyist can even be a government official, such as a governor of a state who presses officials in Washington for specific legislation. A lobbyist may also put together a diverse coalition of organizations and people, sometimes including lawmakers and corporations, and the whole effort may be considered a lobby; for example, in the abortion issue, there is a "pro-choice lobby" and a "pro-life lobby".
An estimate from 2007 reported that more than 15,000 federal lobbyists were based in Washington, DC; another estimate from 2011 suggested that the count of registered lobbyists who have actually lobbied was closer to 12,000. While numbers like these suggest that lobbying is a widespread activity, most accounts suggest that the Washington lobbying industry is an exclusive one run by a few well-connected firms and players, with serious barriers to entry for firms wanting to get into the lobbying business, since it requires them to have been "roaming the halls of Congress for years and years."
It is possible for foreign nations to influence the foreign policy of the United States through lobbying or by supporting lobbying organizations directly or indirectly. For example, in 2016, Taiwanese officials hired American senator-turned-lobbyist Bob Dole to set up a controversial phone call between president-elect Donald Trump and Taiwanese President Tsai Ing-wen.
There are reports that the National Rifle Association, a U.S.-based lobbying group advocating for gun rights, was the target of a decade-long infiltration effort directed by Russian president Vladimir Putin, with allegations that Putin funneled cash through the NRA to aid the election of Donald Trump. There are also reports that Qatar, Saudi Arabia, Bahrain, and the United Arab Emirates have waged intense lobbying campaigns to win over the Trump administration and Congress.
Different types of lobbying:
The focus of lobbying efforts:
Generally, lobbyists focus on trying to persuade decision-makers: Congress, executive branch agencies such as the Treasury Department and the Securities and Exchange Commission, the Supreme Court, and state governments (including governors).
Federal agencies have been targeted by lobbyists since they write industry-specific rules; accordingly, interest groups spend "massive sums of money" trying to persuade them to make so-called "carve-outs" or try to block specific provisions from being enacted. A large fraction of overall lobbying is focused on only a few sets of issues, according to one report.
It is possible for one level of government to lobby another level; for example, the District of Columbia has been lobbying Congress and the President for greater power, including possible statehood or voting representation in Congress; one assessment in 2011 suggested that the district needed to rethink its lobbying strategy, since its past efforts have only had "mixed results".
Many executive branch agencies have the power to write specific rules and are a target of lobbying. Federal agencies such as the State Department make rules such as giving aid money to countries such as Egypt, and in one example, an Egyptian-American businessman named Kais Menoufy organized a lobby to try to halt U.S. aid to Egypt.
Since the Supreme Court has the power of judicial review and can render a congressional law unconstitutional, it has great power to influence the course of American life. For example, in the Roe v. Wade decision, it ruled on the legality of abortion. A variety of forces use lobbying tactics to pressure the court to overturn this decision.
Lobbyists represent their clients' or organizations' interests in state capitols. An example is former school superintendent Michelle Rhee, who has been lobbying state legislatures in California, Michigan and Nevada to overhaul teacher evaluations and to end "Last In, First Out" teacher hiring processes; according to one report, she is becoming a "political force."
State governments can be lobbied by groups which represent other governments within the state, such as a city authority; for example, the cities of Tallahassee and St. Petersburg used paid lobbyists to represent their interests before the Florida legislature. There is lobbying activity at the county and municipal levels, especially in larger cities and populous counties. For example, some Chicago city officials known as aldermen became lobbyists after serving in municipal government, once they had observed the one-year abstention from lobbying required by city ethics rules.
Paid versus free lobbying:
While the bulk of lobbying is done by business and professional interests who hire paid professionals, some lobbyists represent non-profits pro bono on issues in which they are personally interested. Such pro bono clients also offer lobbyists opportunities to meet and socialize with local legislators at events like fundraisers and awards ceremonies.
Single issue versus multiple issue lobbying:
Lobbies which push for a single issue have grown in importance during the past twenty years, according to one source. Corporations generally would be considered as single issue lobbies. If a corporation wishes to change public policy, or to influence legislation which impacts its success as a business, it may use lobbying as a "primary avenue" for this purpose. One research study suggested that single issue lobbies often operate in different kinds of institutional venues, sometimes bringing the same message to different groups.
Lobbies which represent groups such as labor unions, business organizations, trade associations and such are sometimes considered to be multiple issue lobbies, and to succeed they must be somewhat more flexible politically and be willing to accept compromise.
Inside versus outside lobbying:
- Inside lobbying, or sometimes called direct lobbying, describes efforts by lobbyists to influence legislation or rule-making directly by contacting legislators and their assistants, sometimes called staffers or aides.
- Outside lobbying, or sometimes indirect lobbying, includes attempts by interest group leaders to mobilize citizens outside the policymaking community, perhaps by public relations methods or advertising, to prompt them to pressure public officials within the policymaking community. One example of an outside lobbying effort is a film entitled InJustice, made by a group promoting lawsuit reform. Some lobbyists are now using social media to reduce the cost of traditional campaigns, and to more precisely target public officials with political messages.
History of lobbying:
Main article: History of lobbying in the United States
The Constitution was crafted in part to solve the problem of special interests, today usually represented by lobbies, by having these factions compete.
James Madison identified a faction as "a number of citizens, whether amounting to a minority or majority of the whole, who are united and actuated by some common impulse of passion, or of interest, adverse to the rights of other citizens, or to the permanent and aggregate interests of the community", and Madison argued in Federalist No. 10 that there was less risk of injury by a narrowly focused faction in a large republic if any negative influence was counteracted by other factions.
In addition, the Constitution protected free speech, including the right to petition the government, and these rights have been used by lobbying interests throughout the nation's history. There has been lobbying at every level of government, particularly in state governments during the nineteenth century, but increasingly directed towards the federal government in the twentieth century. The last few decades have been marked by an exponential increase in lobbying activity and expenditures.
Lobbying as a business:
Key players:
Lobbyists: The number of registered Washington lobbyists is substantial. In 2009, the Washington Post estimated that there were 13,700 registered lobbyists, describing the nation's capital as "teeming with lobbyists."
In 2011, The Guardian estimated that in addition to the approximately 13,000 registered lobbyists, thousands more unregistered lobbyists could exist in Washington. The healthcare industry alone employed six lobbyists for every elected politician, according to one account.
Nevertheless, the number of lobbyists actively engaged in lobbying is considerably smaller, and the number occupied with lobbying full-time and making significant money is smaller still.
Law firms:
Several law firms, including Patton Boggs, Akin Gump and Holland & Knight, had sizable departments devoted to so-called "government relations". One account suggested that the lobbying arms of these law firms were not held as separate subsidiaries, but that the law practices involved in government lobbying were integrated into the overall framework of the law firm.
A benefit to an integrated arrangement was that the law firm and the lobbying department could "share and refer clients back and forth". Holland & Knight earned $13.9 million from lobbying revenue in 2011.
One law firm employs so-called "power brokers" including former Treasury department officials such as Marti Thomas, and former presidential advisers such as Daniel Meyer.
There was a report that two law firms were treating their lobbying groups as separate business units, and giving the non-lawyer lobbyists an equity stake in the firm.
Lobbying firms:
These firms usually have some lawyers in them, and are often founded by former congressional staffers, legislators, or other politicians. Some lobbying groups have been bought by large advertising conglomerates.
Corporations:
Corporations which lobby actively tend to be few in number, large, and often sell to the government. Most corporations do not hire lobbyists. One study found that the actual number of firms which do lobbying regularly is fewer than 300, and that the percent of firms engaged in lobbying was 10% from 1998 to 2006, and that they were "mainly large, rich firms getting in on the fun."
These firms hired lobbyists year after year, and there was not much evidence of other large firms taking much interest in lobbying. Corporations considering lobbying run into substantial barriers to entry: corporations have to research the relevant laws about lobbying, hire lobbying firms, and cultivate influential people and make connections.
When an issue regarding a change in immigration policy arose, large corporations currently lobbying switched focus somewhat to take account of the new regulatory world, but new corporations—even ones likely to be affected by any possible rulings on immigration—stayed out of the lobbying fray, according to the study.
Still, of all the entities doing lobbying in Washington, the biggest overall spenders are, in fact, corporations. In the first decade of the 2000s, the most lucrative clients for Gerald Cassidy's lobbying firm were corporations, displacing fees from the appropriations business.
Wall Street lobbyists and the financial industry spent upwards of $100 million in one year to "court regulators and lawmakers", particularly since they were "finalizing new regulations for lending, trading and debit card fees." One academic analysis in 1987 found that firms were more likely to spend on lobbying if they were both large and concerned about "adverse financial statement consequences" if they did not lobby.
Big banks were "prolific spenders" on lobbying; JPMorgan Chase has an in-house team of lobbyists who spent $3.3 million in 2010; the American Bankers Association spent $4.6 million on lobbying; an organization representing 100 of the nation's largest financial firms called the Financial Services Roundtable spent heavily as well. A trade group representing hedge funds spent more than $1 million in one quarter trying to influence the government about financial regulations, including an effort to try to change a rule that might demand greater disclosure requirements for funds.
Amazon.com spent $450,000 in one quarter lobbying about a possible online sales tax as well as rules about data protection and privacy. Corporations which sell substantially to the government tend to be active lobbyists. For example, aircraft manufacturer Boeing, which has sizeable defense contracts, pours "millions into lobbying":
"Boeing Co. is one of the most influential companies in airline manufacturing and has continually shown its influence in lobbying Congress ... Between January and September, Boeing spent a total of $12 million lobbying according to research by the Center for Responsive Politics.
Additionally, Boeing has its own political action committee, which donated more than $2.2 million to federal candidates during the 2010 election cycle. Of that sum, 53 percent went to Democrats. ... Through September, Boeing's PAC has donated $748,000 to federal politicians. — Chicago Sun-Times quoting OpenSecrets.org, 2011
In the spring of 2017, there was a fierce lobbying effort by Internet service providers (ISPs) such as Comcast and AT&T, and tech firms such as Google and Facebook, to undo regulations protecting consumer privacy.
Rules passed by the Obama administration in 2016 required ISPs to get "explicit consent" from consumers before gathering browsing histories, locations of businesses visited and applications used, but trade groups wanted to be able to sell this information for profit without consent.
Lobbyists connected with Republican senator Jeff Flake and Republican representative Marsha Blackburn to sponsor legislation to dismantle Internet privacy rules; Flake received $22,700 in donations and Blackburn received $20,500 in donations from these trade groups. On March 23, 2017, abolition of privacy restrictions passed on a narrow party-line vote, and the lobbying effort achieved its result.
In 2017, credit reporting agency Equifax lobbied Congress extensively, spending $1.1 million in 2016 and $500,000 in 2017, seeking rules to limit damage from lawsuits and less regulatory oversight; in August 2017, Equifax's databases were breached and the confidential data of millions of Americans was stolen by hackers and identity thieves, potentially opening up the firm to numerous class action lawsuits.
Major American corporations spent $345 million lobbying for just three pro-immigration bills between 2006 and 2008.
Internet service providers in the United States have spent more than $1.2 billion on lobbying since 1998, and 2018 was the biggest year so far with a total spend of more than $80 million.
Unions
One report suggested the United Food & Commercial Workers International Union spent $80,000 lobbying the federal government on issues relating to "the tax code, food safety, immigration reform and other issues."
Other players:
Other possible players in the lobbying arena are those who might influence legislation: House & Senate colleagues, public opinion in the district, the White House, party leaders, union leaders, and other influential persons and groups. Interest groups are often thought of as "nonparty organizations" which regularly try to change or influence government decision-making.
Lobbying methods and techniques:
Lobbying has much in common with highly people-intensive businesses such as management consulting and public relations, but with a political and legal sensibility.
Like lawmakers, many lobbyists are lawyers, and the persons they are trying to influence have the duty of writing laws. That the disciplines of law and lobbying are intertwined could be seen in the case of a Texas lawyer who had been seeking compensation for his unfairly imprisoned client; since his exonerated-prisoner client had trouble paying the legal expenses, the lawyer lobbied the Texas state legislature to raise the state's payment for unfairly imprisoned prisoners from $50,000 per year to $80,000 per year; it succeeded, making it possible for his newly freed client to pay the lawyer's fees.
Well-connected lobbyists work in Washington for years, know the issues, are highly skilled advocates, and have cultivated close connections with members of Congress, regulators, specialists, and others. They understand strategy and have excellent communication skills; many are well suited to be able to choose which clients they would like to represent.
Lobbyists patiently cultivate networks of powerful people, over many years, trying to build trust and maintain confidence and friendships. When a client hires them to push a specific issue or agenda, they usually form coalitions to exert political pressure. Lobbying, as a result, depends on trying to be flexible to new opportunities, but at the same time, to act as an agent for a client.
As one lobbyist put it: "It's my job to advance the interests of my association or client. Period."
Access is important and often means a one-on-one meeting with a legislator. Getting access can sometimes be difficult, but there are various avenues: email, personal letters, phone calls, face-to-face meetings, meals, get-togethers, and even chasing after congresspersons in the Capitol building:
"My style of lobbying is not to have big formal meetings, but to catch members on the fly as they're walking between the House and the office buildings." — a lobbyist commenting on access.
When getting access is difficult, there are ways to wear down the walls surrounding a legislator. Jack Abramoff explained:
- Access is vital in lobbying. If you can't get in your door, you can't make your case. Here we had a hostile senator, whose staff was hostile, and we had to get in. So that's the lobbyist safe-cracker method: throw fundraisers, raise money, and become a big donor. — Lobbyist Jack Abramoff in 2011
Lobbyists often assist congresspersons with campaign finance by arranging fundraisers, assembling PACs, and seeking donations from other clients. Many lobbyists become campaign treasurers and fundraisers for congresspersons. This helps incumbent members cope with the substantial amounts of time required to raise money for reelection bids; one estimate was that congresspersons had to spend a third of their working hours on fundraising activity.
PACs are fairly easy to set up, requiring only a lawyer and roughly $300. An even steeper possible reward which can be used in exchange for favors is the lure of a high-paying job as a lobbyist; according to Jack Abramoff, one of the best ways to "get what he wanted" was to offer a high-ranking congressional aide a high-paying job after they decided to leave public office.
When such a promise of future employment was accepted, according to Abramoff, "we owned them". This helped the lobbying firm exert influence on that particular congressperson by going through the staff member or aide.
At the same time, it is hard for outside observers to argue that a particular decision, such as hiring a former staffer into a lobbying position, was purely as a reward for some past political decision, since staffers often have valuable connections and policy experience needed by lobbying firms. Research economist Mirko Draca suggested that hiring a staffer was an ideal way for a lobbying firm to try to sway their old bosses—a congressperson—in the future.
Lobbyists, according to several sources, strive for communications which are clear, straightforward, and direct. In a one-on-one meeting with a lobbyist, it helps to understand precisely what goal is wanted. A lobbyist wants action on a bill; a legislator wants to be re-elected. The idea is to persuade a legislator that what the lobbyist wants is good public policy. Lobbyists often urge lawmakers to try to persuade other lawmakers to approve a bill.
Still, persuasion is a subtle business, requiring a deft touch, and carelessness can boomerang. In one instance of a public relations reversal, a lobbying initiative by the Cassidy firm which targeted Senator Robert C. Byrd blew up when the Cassidy-Byrd connection was published in the Washington Post; this resulted in a furious Byrd reversing his previous pro-Cassidy position and throwing a "theatrical temper tantrum" regarding an $18 million facility. Byrd denounced "lobbyists who collect exorbitant fees to create projects and have them earmarked in appropriation bills... for the benefit of their clients."
Since it often takes a long time to build the network of relationships within the lobbying industry, ethical interpersonal dealings are important. A maxim in the industry is for lobbyists to be truthful with people they are trying to persuade; one lobbyist described it this way: "what you've basically got is your word and reputation".
An untruth is too risky to the successful development of a long-term relationship, and the potential gain is not worth the risk. One report suggested that below-the-belt tactics generally do not work. One account suggests that digging for "personal dirt" on opponents is counterproductive, since it would undermine respect for the lobbyist and their clients. By the same logic in reverse, if an untruth is told by an opponent or opposing lobby, then it makes sense to publicize it.
But the general code among lobbyists is that unsubstantiated claims are bad business. Even worse is planting an informant in an opponent's camp, since if this subterfuge is ever discovered, it will boomerang negatively in a hundred ways, and credibility will drop to zero.
The importance of personal relationships in lobbying can be seen in the state of Illinois, in which father-son ties helped push a smart-grid energy bill, although there were accusations of favoritism. And there is anecdotal evidence that a business firm seeking to profitably influence legislation has to pay particular attention to which lobbyist it hires.
Strategic considerations for lobbyists, trying to influence legislation, include "locating a power base" or a constituency logically predisposed to support a given policy.
Timing, as well, is usually important, in the sense of knowing when to propose a certain action and having a big-picture view of the possible sequence of desired actions.
Strategic lobbying tries to estimate the possible responses of different groups to a possible lobby approach; one study suggested that the "expectations of opposition from other interests" was a key factor helping to determine how a lobby should operate.
Increasingly, lobbyists seek to put together coalitions and use outside lobbying by swaying public opinion. Bigger, more diverse and deep pocketed coalitions tend to be more effective in outside lobbying, and the "strength in numbers" principle often applies.
Interest groups try to build "sustainable coalitions of similarly situated individual organizations in pursuit of like-minded goals". According to one study, it is often difficult for a lobbyist to influence a staff member in Congress directly, since staffers tend to be well-informed and subject to views from competing interests.
As an indirect tactic, lobbyists can try to manipulate public opinion which, in turn, can sometimes exert pressure on congresspersons. Activities for these purposes include trying to use the mass media, cultivating contacts with reporters and editors, encouraging them to write editorials and cover stories to influence public opinion, which may have the secondary effect of influencing Congress.
According to analyst Ken Kollman, it is easier to sway public opinion than a congressional staff member since it is possible to bombard the public with "half-truths, distortion, scare tactics, and misinformation." Kollman suggests there should be two goals: (1) communicate that there is public support behind an issue to policymakers and (2) increase public support for the issue among constituents.
Kollman suggested outside lobbying was a "powerful tool" for interest group leaders. In a sense, using these criteria, one could consider James Madison as having engaged in outside lobbying, since after the Constitution was proposed, he wrote many of the 85 newspaper editorials arguing for people to support the Constitution, and these writings later became the Federalist Papers. As a result of this "lobbying" effort, the Constitution was ratified, although there were narrow margins of victory in four of the state ratifying conventions.
Lobbying today generally requires mounting a coordinated campaign, using targeted blitzes of telephone calls, letters, and emails to congressional lawmakers, marches on the National Mall, bus caravans, and such, and these are often put together by lobbyists who coordinate a variety of interest group leaders to unite behind a simple, easy-to-grasp, and persuasive message.
It is important for lobbyists to follow rules governing lobbying behavior. These can be difficult and complex, take time to learn, require full disclosure, and mistakes can land a lobbyist in serious legal trouble.
Gifts for congresspersons and staffers can be problematic, since anything of sizeable value must be disclosed and generally such gifts are illegal. Failure to observe gift restrictions was one factor which caused lobbyist Jack Abramoff to eventually plead guilty to a "raft of federal corruption charges" and led to convictions for 20 lobbyists and public officials, including congressperson Bob Ney and Bush deputy interior secretary Stephen Griles.
Generally gifts to congresspersons or their staffs or federal officials are not allowed, but with a few exceptions: books are permitted, provided that the inside cover is inscribed with the congressperson's name and the name of one's organization. Gifts under $5 are allowed.
Another exception is awards, so it is permitted to give a congressperson a plaque thanking him or her for support on a given issue. Cash gifts payable by check can only be made to campaign committees, not to a candidate personally or to his or her staff; it is not permitted to give cash or stock.
Wealthy lobbyists often encourage other lobbying clients to donate to a particular cause, in the hope that favors will be returned at a later date. Lobbyist Gerald Cassidy encouraged other clients to give for causes dear to a particular client engaged in a current lobbying effort.
Some lobbyists give their own money: Cassidy reportedly donated a million dollars on one project, according to one report, which noted that Cassidy's firm received "many times that much in fees from their clients" paid in monthly retainers. And their clients, in turn, had received "hundreds of millions in earmarked appropriations" and benefits worth "hundreds of millions more".
The dynamics of the lobbying world make it fairly easy for a semi-skilled operator to defraud a client. This is essentially what happened in the Jack Abramoff Indian lobbying scandal. There was a concerned client—in this case, an Indian casino—worried about possible ill-effects of legislation on its gambling business; and there were lobbyists such as Jack Abramoff who knew how to exploit these fears.
The lobbyists actively lobbied against their own casino-client as a way to ratchet up their fears of adverse legislation as well as stoke possible future contributions; the lobbyists committed other violations such as grossly overbilling their clients as well as violating rules about giving gifts to congresspersons.
Numerous persons went to jail after the scandal. Several factors make fraud a fairly easy activity: lobbyists are paid only to try to influence decision-makers, and may or may not succeed, making it hard to tell whether a lobbyist did actual work; much of what happens in interpersonal relations is obscure despite rather strict disclosure and transparency requirements; and sizable sums of money are involved. Factors such as these almost guarantee that there will be future scandals involving fraudulent lobbying activity, according to one assessment.
A fraud similar to Abramoff's was perpetrated in Maryland by lobbyist Gerard E. Evans, who was convicted of mail and wire fraud in 2000 in a case involving falsely creating a "fictitious legislative threat" against a client, and then billing the client to work against this supposed threat.
Lobbyists routinely monitor how congressional officials vote, sometimes checking the past voting records of congresspersons. One report suggested that reforms requiring "publicly recorded committee votes" led to more information about how congresspersons voted, but instead of becoming a valuable resource for the news media or voters, the information helped lobbyists monitor congressional voting patterns. As a general rule, lawmakers must vote as a particular interest group wishes them to vote, or risk losing support.
Strategy usually dictates targeting specific office holders. On the state level, one study suggested that much of the lobbying activity targeted the offices of governors as well as state-level executive bureaucrats; state lobbying was an "intensely personal game" with face-to-face contact being required for important decisions.
Lobbying can be a counteractive response to the lobbying efforts of others. One study suggested this was particularly true for battles surrounding possible decisions by the Supreme Court which is considered as a "battleground for public policy" in which differing groups try to "etch their policy preferences into law".
Sometimes there are lobbying efforts to slow or derail other legislative processes; for example, when the FDA began considering a cheaper generic version of the costly anti-clotting drug Lovenox, the French pharmaceutical firm Sanofi "sprang into action to try and slow the process." Lobbyists are often assembled in anticipation of a potential takeover bid, particularly when there are large high-profile companies, or a large foreign company involved, and substantial concern that the takeover may be blocked by regulatory authorities.
An example may illustrate. The company Tyco had learned that there had been discussion about a possible new tax provision that might have cost it $4 billion overall. So the firm hired Jack Abramoff and paid him a retainer of $100,000 a month. He assembled dozens of lobbyists with connections to key congressional committees with the ultimate objective being to influence powerful Senator Charles Grassley.
Abramoff began with a fundraising effort to round up "every check" possible. He sought funds from his other lobbying clients:
- "I had my clients understand that just as other clients who had nothing to do with them, would step up and give contributions to congressmen they needed to have some sway with, so similarly they needed to do the same. I went to every client I could, and rounded up every check we could for him."
Lobbyists as educators and advisors:
Since government has grown increasingly complex, having to deal with new technologies, the task of writing rules has become more complex. "Government has grown so complex that it is a virtual certainty that more than one agency would be affected by any piece of legislation," according to one view.
Lobbyists, therefore, spend considerable time learning the ins and outs of issues, and can use their expertise to educate lawmakers and help them cope with difficult issues. Lobbyists' knowledge has been considered to be an intellectual subsidy for lawmakers. Some lobbyists become specialists with expertise in a particular set of issues, although one study suggested that of two competing criteria for lobbyists—expertise or access—that access was far more important.
Lobby groups and their members sometimes also write legislation and whip bills, and in these instances, it is helpful to have lawyers skilled in writing legislation to assist with these efforts. It is often necessary to research relevant laws and issues beforehand. In many instances lobbyists write the actual text of the proposed law, and hire lawyers to "get the language down pat"—an omission in wording or an unclear phrase may open up a loophole for opponents to wrangle over for years. And lobbyists can often advise a lawmaker on how to navigate the approval process.
Lobbying firms can serve as mentors and guides. For example, after months of protests by the Occupy Wall Street movement, one lobbying firm prepared a memo to its clients warning that Republicans may "turn on big banks, at least in public," which may have the effect of "altering the political ground for years to come."
Here are parts of the memo which were broadcast on the MSNBC network: "Leading Democratic party strategists have begun to openly discuss the benefits of embracing the growing and increasingly organized Occupy Wall Street (OWS) movement ... This would mean more than just short-term discomfort for Wall Street firms. If vilifying the leading companies of this sector is allowed to become an unchallenged centerpiece of a coordinated Democratic campaign, it has the potential to have very long-lasting political, policy and financial impacts on the companies in the center of the bullseye. ... the bigger concern should be that Republicans will no longer defend Wall Street companies...
— Clark, Lytle, Geduldig, Cranford, law/lobbying firm, to a Wall Street client
A Growing Billion Dollar Business:
Since the 1970s, there has been explosive growth in the lobbying industry, particularly in Washington, D.C. By 2011, one estimate put overall lobbying spending nationally at more than $30 billion. An estimate of lobbying expenses in the federal arena was $3.5 billion in 2010, up from $1.4 billion in 1998. And there is prodigious data, since firms are required to disclose lobbying expenditures on a quarterly basis.
The industry, however, is not immune to economic downturns. When Congress was gridlocked, as during the summer and early fall of 2011, lobbying activity dipped considerably, according to The Washington Post. Lobbying firm Patton Boggs reported a drop in revenue that year, from $12 million in 2010 to $11 million in 2011. To cope with the downturn, some law firms compensated by increasing activity in litigation, regulatory work, and representing clients in congressional investigations.
A sea-change in government, such as a shift in control of the legislature from one political party to the other, can affect the lobbying business profoundly. For example, when the primarily Democratic-serving lobbying firm Cassidy & Associates learned that control of Congress would change hands from Democrats to Republicans in 1994, it acquired Republican lobbyists before the congressional handover of power, a move that helped the firm stay on top of the new political realities.
Examples of lobbying:
There are numerous examples of lobbying activity reported by the media. One report chronicled a somewhat unusual alliance of consumer advocates and industry groups to boost funding for the Food and Drug Administration; the general pattern of lobbying efforts had been to try to reduce the regulatory oversight of such an agency. In this case, however, lobbying groups wanted the federal watchdog agency to have tougher policing authority to avert expensive problems when oversight was lax; in this case, industry and consumer groups were in harmony, and lobbyists were able to persuade officials that higher FDA budgets were in the public interest.
Religious consortiums, according to one report, have engaged in a $400 million lobbying effort on such issues as the relation between church and state, civil rights for religious minorities, bioethics issues including abortion and capital punishment and end-of-life issues, and family issues.
Lobbying as a career:
While national-level lobbyists working in Washington have the highest salaries, many lobbyists operating at the state level can earn substantial salaries. The table shows the top lobbyists in one state, Maryland, in 2011.
Top power-brokers such as Gerald Cassidy have made fortunes from lobbying:
Cassidy's reaction to his own wealth has been complicated. He lives large, riding around town in his chauffeured car, spending thousands on custom-made clothes, investing big money in, for example, the Charlie Palmer Steak restaurant at the foot of Capitol Hill just for the fun of it. He has fashioned a wine cellar of more than 7,000 bottles. He loves to go to England and live like a gentleman of the kind his Irish antecedents would have considered an anathema.
— journalist Robert G. Kaiser in 2007 in the Washington Post.
Effectiveness of lobbying:
The consensus view is that lobbying generally achieves sought-after results for clients, particularly since the practice has become so prevalent, with substantial and growing budgets, although there are dissenting views.
A study by the investment-research firm Strategas, cited in The Economist and the Washington Post, identified the 50 firms that spent the most on lobbying relative to their assets and compared their financial performance against that of the S&P 500 in the stock market; the study concluded that spending on lobbying was a "spectacular investment" yielding "blistering" returns comparable to a high-flying hedge fund, even during the financial downturn of the preceding few years.
A 2009 study by University of Kansas professor Raquel Meyer Alexander suggested that lobbying brought a substantial return on investment.
A 2011 meta-analysis of previous research findings found a positive correlation between corporate political activity and firm performance.
There are numerous reports that the National Rifle Association (NRA) successfully influenced 45 senators to block a proposed rule regulating assault weapons, despite strong public support for gun control. The NRA spends heavily to influence gun policy; it gives $3 million annually directly to the re-election campaigns of congresspersons, and gives additional money to PACs and others to influence legislation indirectly, according to the BBC in 2016.
There is widespread agreement that a key ingredient in effective lobbying is money. This view is shared by players in the lobbying industry.
Deep pockets speak; the money trumps it all — Anonymous lobbyist, 2002
Still, effectiveness can vary depending on the situational context. One view is that large multiple-issue lobbies tend to be effective in getting results for their clients if they are sophisticated, managed by a legislative director familiar with the art of compromise, and play "political hardball".
But if such lobbies become too big, as with large industrial trade organizations, they become harder to control, often leading to lackluster results.
A 2001 study comparing lobbying activity in US-style congressional and European-style parliamentary systems found that congressional systems give an advantage to the "agenda-setters", but that in both systems "lobbying has a marked effect on policies".
One report suggested that the 1,000 registered lobbyists in California were so influential that they were called the Third House.
Studies of lobbying by academics in previous decades painted a picture of lobbying being an ineffectual activity, although many of these studies were done before lobbying became prevalent in American politics.
A 1963 study by Bauer, Pool, and Dexter suggested lobbyists were mostly "impotent" in exerting influence. Studies in the early 1990s suggested that lobbying exerted influence only "marginally", although they also suggested that when lobbying did achieve political impact, the results were sufficient to justify the expenditure.
A 2009 study concluded that Washington lobbies are "far less influential than political rhetoric suggests", that most lobbying campaigns do not change any views, and that the status quo is strongly entrenched.
But much depends on what counts as "effective": many lobbying battles between powerful interests end in stalemate, and in many cases merely preserving the status quo can be seen as a victory of sorts. What often happens is that varying coalitions find themselves in "diametrical opposition to each other" and stalemates result.
There is anecdotal evidence from numerous newspaper accounts of different groups battling that lobbying activity usually achieves results. For example, the Obama administration pledged to stop for-profit colleges from "luring students with false promises", but faced with this threat, the lobbying industry sprang into action with a $16 million campaign, and its efforts succeeded in watering down the proposed restrictions. How did the campaign succeed? Actions taken included:
- spent $16 million
- hired "all-star list" of prominent players including Democrats with White House ties
- plotted strategy
- worked with "fund-raising bundler" Jamie Rubin, a former Obama communications director
- won support from influential people including congressperson-turned-lobbyist Dick Gephardt, senator-turned-lobbyist John Breaux, lobbyist Tony Podesta, Washington Post CEO Donald E. Graham, education entrepreneur and University of Phoenix founder John Sperling, and others.
- key leaders made "impassioned appeals"
- mounted a mobilization effort that produced 90,000 public documents to the Education Department advocating against the changes
And sometimes merely keeping the status quo can be seen as a victory. When gridlock led to the supposed supercommittee solution, numerous lobbyists from all parts of the political spectrum worked hard and a stalemate resulted, with each side defending its own special interests. And while money is an important variable, it is one among many, and there have been instances in which huge sums were spent on lobbying only to have the result backfire.
One report suggested that the communications firm AT&T failed to achieve substantial results from its lobbying efforts in 2011, since government antitrust officials rejected its plan to acquire rival T-Mobile.
Lobbying is a practical necessity for firms that "live and die" by government decisions, such as large government contractors like Boeing.
A study done in 2006 by Bloomberg News suggested that lobbying was a "sound money-making strategy" for the 20 largest federal contractors. The largest contractor, Lockheed Martin Corporation, received almost $40 billion in federal contracts in 2003-4, and spent $16 million on lobbying expenses and campaign donations. For each dollar of lobbying investment, the firm received $2,517 in revenues, according to the report.
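That headline ratio can be sanity-checked with back-of-envelope arithmetic. The sketch below uses only the rounded figures quoted above; Bloomberg's exact $2,517 figure presumably reflects more precise underlying totals:

```python
# Rough sanity check of the reported return-on-lobbying ratio for
# Lockheed Martin, using the round figures quoted in the text.
contracts = 40_000_000_000  # ~$40 billion in federal contracts, 2003-4
lobbying = 16_000_000       # $16 million in lobbying and campaign donations

revenue_per_lobbying_dollar = contracts / lobbying
print(f"${revenue_per_lobbying_dollar:,.0f} per dollar spent")  # ≈ $2,500
```

The rounded inputs give roughly $2,500 of contract revenue per lobbying dollar, consistent with the $2,517 the report cites.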
When the lobbying firm Cassidy & Associates began achieving results with earmarks for colleges and universities and medical centers, new lobbying firms rose to compete with them to win "earmarks of their own", a clear sign that the lobbying was exceedingly effective.
Lobbying controversies:
Lobbying has been the subject of much debate and discussion. There is general consensus that lobbying has been a significant corrupting influence in American politics, although criticism is not universal, and there have been arguments put forward to suggest that the system is working properly.
Unfavorable image:
Generally the image of lobbyists and lobbying in the public sphere is not a positive one, although this is not a universal sentiment. Lobbyists have been described as a "hired gun" without principles or positions.
Scandals involving lobbying have helped taint the image of the profession, such as those involving lobbyist Jack Abramoff and congressmen Randy "Duke" Cunningham and Bob Ney, in which words such as "bribery", "lobbyist", "member of Congress", and "prison" tended to appear together in the same articles.
Negative publicity can sully lobbying's image to a great extent:
- high-profile cases of lobbying fraud such as Abramoff's;
- dubious father-son exchange-of-favors ties;
- public officials such as Newt Gingrich being accused of lobbying, and then denying the accusations, after earning $1.6 million for "strategic advice".
There are a variety of reasons why lobbying has acquired a negative image in the public consciousness. While much lobbying activity is disclosed, much of it happens in personal meetings that are hard to document, and the resulting secrecy and confidentiality can serve to lower lobbying's status.
Revolving door:
Main article: Revolving door (politics)
Since the 1980s, congresspersons and staffers have been "going downtown"—becoming lobbyists—and the big draw is money. The "lucrative world of K Street" means that former congresspersons with even "modest seniority" can move into jobs paying $1 million or more annually, without including bonuses for bringing in new clients.
The general concern of this revolving-door activity is that elected officials—persons who were supposed to represent the interests of citizens—have instead become entangled with the big-money interests of for-profit corporations and interest groups with narrow concerns, and that public officials have been taken over by private interests.
In July 2005, Public Citizen published a report entitled "The Journey from Congress to K Street": the report analyzed hundreds of lobbyist registration documents filed in compliance with the Lobbying Disclosure Act and the Foreign Agents Registration Act among other sources. It found that since 1998, 43 percent of the 198 members of Congress who left government to join private life have registered to lobby.
A similar report from the Center for Responsive Politics found 370 former members were in the "influence-peddling business", with 285 officially registered as federal lobbyists, and 85 others who were described as providing "strategic advice" or "public relations" to corporate clients.
The Washington Post described these results as reflecting the "sea change that has occurred in lawmakers' attitudes toward lobbying in recent years." The report included a case study of one particularly successful lobbyist, Bob Livingston, who stepped down as Speaker-elect and resigned his seat in 1999.
In the six years since his resignation, The Livingston Group grew into the 12th largest non-law lobbying firm, earning nearly $40 million by the end of 2004. During roughly the same time period, Livingston, his wife, and his two political action committees (PACs) contributed over $500,000 to the campaign funds of various candidates.
Numerous reports chronicle the revolving door phenomenon. A 2011 estimate suggested that nearly 5,400 former congressional staffers had become federal lobbyists over a ten-year period, and 400 lawmakers made a similar jump. It is a "symbiotic relationship" in the sense that lobbying firms can exploit the "experience and connections gleaned from working inside the legislative process", and lawmakers find a "ready pool of experienced talent."
There is movement in the other direction as well: one report found that 605 former lobbyists had taken jobs working for lawmakers over a ten-year period. A study by the London School of Economics found 1,113 lobbyists who had formerly worked in lawmakers' offices.
The lobbying option is a way for staffers and lawmakers to "cash in on their experience", according to one view. Before the 1980s, staffers and aides worked many years for congresspersons, sometimes decades, and tended to stay in their jobs; now, with the lure of higher-paying lobbying jobs, many would quit their posts after a few years at most to "go downtown."
And it is not just staffers, but lawmakers as well, including high-profile ones such as congressperson Richard Gephardt. He represented a "working-class" district in Missouri for many years, but after leaving Congress he became a lobbyist. In 2007, he began his own lobbying firm called "Gephardt Government Affairs Group", and in 2010 it was earning close to $7 million in revenues with clients including Goldman Sachs, Boeing, Visa Inc., Ameren Corporation, and Waste Management Inc.
Senators Robert Bennett and Byron Dorgan became lobbyists too. Mississippi governor Haley Barbour became a lobbyist. In 2010, former representative Billy Tauzin earned $11 million running the drug industry's lobbying organization, called Pharmaceutical Research and Manufacturers of America (PhRMA).
Tauzin's bill to provide prescription drug access to Medicare recipients gave major concessions to the pharmaceutical industry: (1) Medicare was prevented from negotiating lower costs for prescription drugs (2) the reimportation of drugs from first world countries was not allowed (3) Medicare D was undermined by a policy of Medigap D.
After the bill passed a few months later, Tauzin retired from Congress and took an executive position at PhRMA to earn an annual salary of $2 million. Many former representatives earned over $1 million in one year, including James Greenwood and Daniel Glickman.
Insider's Game:
A similar concern voiced by critics of lobbying is that Washington politics has become dominated by elites, and that it is an "insider's game" excluding regular citizens and which favors entrenched firms.
Individuals generally cannot afford to lobby, and critics question whether corporations with "deeper pockets" should have greater power than ordinary persons. In this view, the system favors the rich, such that the "rich have gotten richer, the weak weaker", admits lobbyist Gerald Cassidy.
There is concern that those with more money and better political connections can exert more influence than others. There is so much money that it has been described as a "flood" with a "corrupting influence", leaving the United States appearing "awash" in interest groups. However, analyst Barry Hessenius has made a case that excessive for-profit lobbying could be counteracted by greater efforts to increase nonprofit lobbying and boost its effectiveness.
If coalitions of different forces battle in the political arena for favorable treatment and better rules and tax breaks, it can be seen as fair if both sides have equal resources and try to fight for their interests as best they can.
"In a lot of areas, the stakes are between big companies, and it's hard to argue that one solution is better than another solution with regard to the consumer's interest ... The issue ... is whether Company A's solution, or Company B's solution, based on their technology or their footprint, is the right one."
— Lobbyist Gerald Cassidy
A related but slightly different criticism is that the problem with lobbying as it exists today is that it creates an "inequity of access to the decision-making process". As a result, important needs get left out of the political evaluation, such that there are no anti-hunger lobbies or lobbies seeking serious solutions to the problem of poverty.
Nonprofit advocacy has been "conspicuously absent" from lobbying efforts, according to one view. Critics suggest that when a powerful coalition battles a less powerful one, or one which is poorly connected or underfunded, the result may be seen as unfair and potentially harmful for the entire society.
The increasing number of former lawmakers becoming lobbyists has led Senator Russ Feingold (D-WI) to propose paring back the many Capitol Hill privileges enjoyed by former senators and representatives. His plan would deprive lawmakers-turned-lobbyists of privileges such as unfettered access to otherwise "members only" areas such as the House and Senate floors and the House gym.
Choice-making problems:
A concern among many critics is that influence peddling hurts overall decision making: proposals with merit are dropped in favor of proposals backed by political expediency.
An example cited in the media is the battle between food-industry lobbyists and healthcare lobbyists over school lunches. A group supported by the United States Department of Agriculture proposed healthier lunches as a way to combat childhood obesity by limiting servings of potatoes, limiting salty foods, and adding more fresh vegetables, but it was countered by a strong food lobby backed by Coca-Cola, Del Monte, and makers of frozen pizza.
The food lobbyists succeeded in blocking the proposed reforms, even writing rules suggesting that the tomato paste on a pizza qualified as a vegetable, but overall, according to critics, this case appeared to be an example where business interests won out over health concerns.
Critics use examples such as these to suggest that lobbying distorts sound governance. A study by IMF economists found that the "heaviest lobbying came from lenders making riskier loans and expanding their mortgage business most rapidly during the housing boom," and that there were indications that heavy-lobbying lenders were more likely to receive bailout funds. The study found a correlation between lobbying by financial institutions and excessive risk-taking during 2000–2007, and the authors concluded that "politically active lenders played a role in accumulation of risks and thus contributed to the financial crisis".
Another study suggested that governments tend to protect domestic industries, and have a habit of shunting monies to ailing sectors; the study suggested that "it is not that government policy picks losers, it is that losers pick government policy." One critic suggested that the financial industry has successfully blocked attempts at regulation in the aftermath of the 2008 financial collapse.
Governmental focus:
Critics have contended that when lawmakers are drawn into battles over issues such as the composition of school lunches or how much an ATM fee should be, more serious issues such as deficit reduction, global warming, or social security are neglected, leading to legislative inertia.
The concern is that preoccupation with what are seen as superficial issues prevents attention to long-term problems. Critics suggested that the 2011 Congress spent more time discussing per-transaction debit-card fees than issues seen as more pressing.
Methodological problems:
In this line of reasoning, critics contend that lobbying, in and of itself, is not the sole problem, but only one aspect of a larger problem with American governance. Critics point to an interplay of factors: citizens being uninvolved politically; congresspersons needing huge sums of money for expensive television advertising campaigns; increased complexity in terms of technologies; congresspersons spending three days of every week raising money; and so forth.
Given these temptations, lobbying came along as a logical response to meet the needs of congresspersons seeking campaign funds and staffers seeking personal enrichment. In a sense, in competitive politics, the common good gets lost:
I know what my client wants; no one knows what the common good is.
— Anonymous lobbyist
A lobbyist can identify a client's needs. But it is hard for a single individual to say what is best for the whole group. The intent of the Constitution's Framers was to have built-in constitutional protections to protect the common good, but according to these critics, these protections do not seem to be working well:
- The structure of representative government, elected by the people, was to be our system's built-in protection of the whole of us—fairly elected officeholders were to represent their constituent groups, free from any obligations to special interests.
- Unfortunately, money has corrupted the system and compromised both the fairness of the electoral process as well as the independence and impartiality of elected officials.
Lawrence Lessig, a professor at Harvard Law School and author of Republic, Lost, suggested that the moneyed persuasive power of special interests has insinuated itself between the people and the lawmakers.
He quoted congressperson Jim Cooper who remarked that Congress had become a "Farm League for K Street" in the sense that congresspersons were focused on lucrative lobbying careers after Congress rather than on serving the public interest while in office. In a speech, Lessig suggested the structure of incentives was such that legislators were tempted to propose unnecessary regulations as a way to further lobbying industry activity.
According to one view, major legislation such as proposed Wall Street reforms have spurred demand for "participating in the regulatory process." Lessig suggested the possibility that it was not corporations deciding to take up lobbying, but Congress choosing to debate less-than-important issues to bring well-heeled corporations into the political fray as lobbyists.
As a result of his concerns, Lessig has called on state governments to summon a Second Constitutional Convention to propose substantive reform. Lessig believes that a constitutional amendment should be written to limit political contributions from non-citizens, including corporations, anonymous organizations, and foreign nationals.
"Our current tax system with all its complexities is in part designed to make it easier for candidates, in particular congressmen, to raise money to get back to Congress ... All sorts of special exceptions which expire after a limited period of time are just a reason to pick up the phone and call somebody and say 'Your exception is about to expire, here's a good reason for you to help us fight to get it to extend.' And that gives them the opportunity to practice what is really a type of extortion – shaking the trees of money in the private sector into their campaign coffers so that they can run for Congress again."
— Lawrence Lessig, 2011
Scholars such as Richard Labunski, Sanford Levinson, Glenn Reynolds, Larry Sabato as well as newspaper columnist William Safire, and activists such as John Booth of RestoringFreedom.org have called for constitutional changes that would curb the powerful role of money in politics.
Expansion of lobbying:
Law in the United States is generally made by Congress, but as the federal government expanded during much of the twentieth century, a sizeable number of federal agencies emerged, generally under the control of the president. These agencies often write industry-specific rules and regulations regarding such things as automobile safety and air quality.
Unlike elected congresspersons who are constantly seeking campaign funds, these appointed officials are harder to influence, generally. However, there are indications that lobbyists seek to expand their influence from the halls of Congress deeper into the federal bureaucracy.
President Obama pledged during the election campaign to rein in lobbying. As president in January 2009, he signed two executive orders and three presidential memoranda to help ensure his administration would be more open, transparent, and accountable. These documents attempted to bring increased accountability to federal spending and limit the influence of special interests, and included a lobbyist gift ban and a revolving door ban.
The Recovery Act Lobbying Rules followed in May 2009. The Executive Branch Reform Act, H.R. 985, was a bill which would have required over 8,000 Executive Branch officials to report into a public database nearly any "significant contact" from any "private party". The purpose was to identify lobbying activity. Proponents, including groups such as Public Citizen, supported the bill as an expansion of "government in the sunshine".
But the proposals ran into serious opposition from various groups including the lobbying industry itself. Opponents argued that the proposed reporting rules would have infringed on the right to petition, making it difficult not just for lobbyists, but for regular citizens to communicate their views on controversial issues without having their names and viewpoints entered into a government database.
Opposition groups suggested that although the proposed rules were promoted as a way to regulate "lobbyists," persons described as a "private party" could be practically anybody, and that anybody contacting a federal official might be deemed to be a "lobbyist".
The U.S. Department of Justice raised constitutional and other objections to the bill. Opponents mobilized over 450 groups including the U.S. Chamber of Commerce and National Association of Realtors with letter writing campaigns against the proposed restrictions.
Lobbyist Howard Marlowe argued in a "stern letter" that the restriction on gift-giving to federal employees would create "fear of retribution for political donations":
- "Since your announcement to seek the Presidency you have consistently attacked the honorable profession of lobbying ... Lobbyists play an important role in the legislative process, serving as educators to elected officials. It is in the best interest to government to have informed individuals who serve as experts in every arena of public policy.
- Our ability to access and navigate the legislative process and push issues forward through a bureaucratic cluster is a vital service to the nation. The Draft Order would inhibit one of the most vital tools in the advocate's arsenal by creating fear of retribution for political donations. Making this kind of disclosure a part of the bidding process tarnishes a competition based on qualifications, adds an unneeded level of bureaucracy, and endangers the protection of free speech afforded to all Americans by the First Amendment of the Constitution..."
In 2011, there were efforts to "shift regulatory power from the executive branch to Congress" by requiring that any "major rule" which may cost the economy more than $100 million must be decided by Congress with an up-or-down vote. But skeptics think that such a move proposed by Republican lawmakers could "usher in a lobbying bonanza from industry and other special-interest groups" to use campaign contributions to reshape the regulatory milieu.
Potential for reform:
Critics suggest that Congress has the power to fix itself, but is reluctant to sacrifice money and power. One report suggested that those in control had an "unbroken record of finding ways to navigate around reform laws or turn regulatory standards to their own advantage."
Arguments for lobbying:
There are counterarguments that the system is working as it should, despite being rather messy. According to this line of argument, the Madisonian view of politics—in which factions were supposed to compete with other factions—is working exactly as it should.
Competing factions, or in this case competing interest groups, square off. Battles happen within the federal government, but instead of being settled by elections, arguments are settled by powerful interest groups fighting each other, often financially. To members of a group that lost a lobbying battle, it might appear that the other side won unfairly by lobbying with more money.
There are numerous instances in which opposed lobbies stalemate, and instances in which these stalemates have been seen as a positive result. And sometimes powerful financial interests lose the battle.
Lobbying brings valuable information to policymakers, according to another argument in favor of lobbying. Since lobbyists often become highly knowledgeable about a specific issue by studying it in depth over years, they can bring considerable expertise to help legislators avoid errors as well as grasp the nuances of complex issues.
This information can also help Congress oversee numerous federal agencies which often regulate complex industries and issue highly detailed and specific rulings. Accordingly, it is difficult for Congress to keep track of what these agencies do. It has been argued that lobbyists can help Congress monitor this activity by possibly raising "red flags" about proposed administrative rulings.
Further, congresspersons can quickly gauge where they stand about a proposed administrative ruling simply by seeing which lobbying groups support the proposal, and which oppose it.
Another argument in support of lobbying is that different interest groups and lobbyists, while trying to build coalitions and win support, often amend or soften or change their positions in this process, and that interest groups and lobbyists regulate each other, in a sense.
But a more general sentiment supporting the lobbying arrangement is that every citizen can be construed as being "represented" by dozens of special interests:
Every citizen is a special interest... Blacks, consumers, teachers, pro-choicers, gun control advocates, handicapped people, aliens, exporters, and salesmen – are all special interests...
There is not an American today who is not represented (whether he or she knows it or not) by at least a dozen special interest groups. ... One person's special interest is another person's despotism...— Donald E. deKieffer, author of The Citizen's Guide to Lobbying Congress, 2007
If powerful groups such as the oil industry succeed in winning a battle in government, consumers who drive gas-powered cars can benefit a bit, according to this view. Even readers of Wikipedia could be conceived as being a special interest and represented by various lobbies.
For example, opponents of the Stop Online Piracy Act believed that the act might restrict sites such as Wikipedia; on January 18, 2012, as a form of protest and as a way to encourage readers and contributors of English Wikipedia to write their congresspersons, the online encyclopedia was "blacked out" for a day as part of an effort to lobby the government.
Another view in support of lobbying is that it serves as a guard against extremism. According to this view, lobbying adds "built-in delays" and encourages opposing lobbies to battle; in the battling, potentially damaging decrees and incorrect decisions are stymied by seemingly unhelpful delays.
A slightly different view is that lobbying is no different from other professions: "Lobbying is no more perfect than is the practice of law or the practice of medicine."
— Lobbyist Gerald S. J. Cassidy, 2007
The regulatory environment:
Disclosure and domestic regulations:
Generally, the United States requires systematic disclosure of lobbying, and it may be one of the few countries with such extensive requirements. Disclosure in one sense allows lobbyists and public officials to justify their actions under the banner of openness and in full compliance with the law.
The rules often specify how much a lobbyist can spend on specific activities, and how to report expenses; many of the laws and guidelines are specified in the Lobbying Disclosure Act of 1995.
Transparency and disclosure requirements mean that there are volumes of statistics available for all kinds of analyses—by journalists, by the public, by rival lobbying efforts. Researchers can subdivide lobbying expenditures by numerous breakdowns, such as by contributions from energy companies.
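Because the disclosures are filed as structured records, such breakdowns reduce to simple aggregation. A minimal sketch in Python, using invented records rather than any real disclosure-database schema:

```python
from collections import defaultdict

# Hypothetical disclosure records: (client, sector, quarterly spending in $).
# These names and amounts are made up for illustration.
filings = [
    ("Acme Energy",   "energy",     250_000),
    ("Best Pipeline", "energy",     410_000),
    ("MediCorp",      "healthcare", 380_000),
]

# Subdivide total lobbying expenditures by sector.
by_sector = defaultdict(int)
for client, sector, spent in filings:
    by_sector[sector] += spent

print(dict(by_sector))  # {'energy': 660000, 'healthcare': 380000}
```

The same grouping could be done by contributing firm, by quarter, or by target agency, which is how journalists and rival lobbies slice the published data.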
Sometimes defining clearly who is a "lobbyist" and what precisely are lobbying activities can be difficult. According to the Lobbying Disclosure Act, several authorized definitions include:
- Lobbying activities means "lobbying contacts and efforts in support of such contacts, including preparation and planning activities, research and other background work that is intended, at the time it is performed, for use in contacts, and coordination with the lobbying activities of others."
- Lobbying contact means "any oral or written communication (including an electronic communication) to a covered executive branch official or a covered legislative branch official".
Still, distinguishing a lobbyist from a strategic adviser can be difficult, since the duties of each often overlap and are hard to define precisely.
There have also been questions about what distinguishes a lobbyist from a bundler. One report described bundlers as "supporters who contribute their own money to his campaign and solicit it from others"; there was a question whether such persons raising campaign money for the election of Barack Obama were really lobbyists, and whether Obama had broken his own pledge not to receive money from lobbyists.
The legal ramifications of lobbying are further intertwined with aspects of campaign finance reform, since lobbyists often spend time seeking donations for the reelection efforts of congresspersons; sorting out these issues can pose ethical challenges.
There are numerous regulations governing the practice of lobbying, often ones requiring transparency and disclosure. People paid to lobby must register with the secretary of the Senate and the clerk of the House of Representatives within 45 days of contacting a legislator for the first time, or 45 days after being employed.
An exception is that lobbyists who earn less than $3,000 per client for each fiscal quarter, or whose total lobbying expenses are less than $11,500 each quarter, do not need to register.
Part-time lobbyists are exempt from registering unless they spend more than 20% of their working hours on lobbying activities in any quarter. A lobbyist who has two or more contacts with a legislator must register.
Requirements for registering also apply to companies that specialize in lobbying, or ones that have an in-house lobbyist, particularly if they spend more than $11,500 on lobbying.
Generally, nonprofit organizations, other than churches, are exempt from registering if they hire an outside lobbying firm. Filings must be made each quarter, with a separate filing for each of the lobbyist's clients, including information such as the name and title of the client, an estimate of lobbying expenses, and an estimate of the income the lobbyist received for the work.
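As a rough illustration, the registration thresholds quoted above can be encoded as a rule check. The sketch below uses the figures as stated in this section ($3,000 in income per client per quarter, $11,500 in quarterly expenses, the 20%-of-working-time test, and the two-contact trigger); it is not a statement of the actual statutory test, and the function name and structure are our own.

```python
def must_register(income_per_client: float,
                  quarterly_expenses: float,
                  lobbying_time_fraction: float,
                  contacts_with_legislators: int) -> bool:
    """Sketch of the registration tests described in the text.

    Exempt when income per client is under $3,000 per quarter or
    total lobbying expenses are under $11,500 per quarter; also
    exempt when lobbying takes under 20% of working time or there
    are fewer than two lobbying contacts. Hypothetical helper,
    not legal advice.
    """
    if income_per_client < 3_000 or quarterly_expenses < 11_500:
        return False  # below the monetary floors
    if lobbying_time_fraction < 0.20:
        return False  # part-time exemption
    if contacts_with_legislators < 2:
        return False  # fewer than two lobbying contacts
    return True

# A full-time lobbyist with a $50,000 client and many contacts:
print(must_register(50_000, 20_000, 0.5, 10))  # True
```

The real statutory tests interact in more complicated ways, but the sketch conveys how the quoted dollar and time thresholds carve out exemptions.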
States, in addition, are moving in the direction of greater disclosure and transparency regarding lobbying activities. California has an online database called Cal-Access, although there have been reports that it is underfunded. Money collected from registration fees is often used to pay for disclosure services such as Cal-Access.
There were complaints in Illinois that the disclosure requirements were often not rigorous enough and allowed lobbyists to work "without public notice" and with possible "conflicts of interest".
Many local municipalities now require legislative agents to register as lobbyists in order to represent the interests of clients to city council members; examples include Columbus and Cincinnati in the swing state of Ohio.
Laws requiring disclosure became more prevalent in the twentieth century. In 1946, a so-called "sunshine law" required lobbyists to disclose what they were doing, on whose behalf, and how much they received in payment. The resulting Federal Regulation of Lobbying Act of 1946 governed lobbying rules until 1995, when the Lobbying Disclosure Act replaced it.
The Federal Election Campaign Act of 1971, amended in 2002 by the McCain-Feingold Act, governs campaign contributions. Each chamber of Congress has rules as well. Legislation generally requires reports containing an accounting of major expenditures as well as legislation that was influenced; the wording of some of the pertinent laws can be found in 2 U.S.C. ch. 26.
Lobbying law is a constantly evolving field; the American Bar Association published a book of guidelines in 2009 with over 800 pages. The laws are often rather specific, and when not observed, can lead to serious trouble.
Failing to file a quarterly report, or knowingly filing an incorrect report, or failing to correct an incorrect report, can lead to fines up to $200,000 and imprisonment up to five years.
Penalties can apply to lobbyists who fail to list gifts made to a legislator. In other situations, the punishment can be light: for example, Congressional aide-turned-lobbyist Fraser Verrusio spent a few hours in jail after pleading guilty to taking a client to a World Series baseball game and failing to report it.
Tax rules can apply to lobbying. In one situation, the charity Hawaii Family Forum risked losing its tax-exempt status after it had engaged in lobbying activity; federal tax law requires charities such as that one to limit their lobbying to 20% of their overall expenditures or else be eligible for being taxed like a for-profit corporation.
Lobbyists sometimes support rules requiring greater transparency and disclosure:
"Our profession is at a critical point where we can either embrace the constructive changes and reforms by Congress or we can seek out loopholes and continue the slippery slide into history alongside the ranks of snake oil salesmen." — Lobbyist Gerald S. J. Cassidy, 2007
Scandals can spur impetus towards greater regulation as well. The Jack Abramoff Indian lobbying scandal, which started in the 1990s and led to a guilty plea in 2006, inspired the Legislative Transparency and Accountability Act of 2006 (S. 2349). According to Time Magazine, the Senate bill:
- barred lobbyists themselves from buying gifts and meals for legislators, but left a loophole in which firms and organizations represented by those lobbyists could still dole out gifts and perks;
- allowed privately funded trips if lawmakers got prior approval from the ethics committee;
- required lobbyists to file frequent and detailed activity reports and have them posted publicly. The bill was approved in 2006 by a 90–8 vote.
In 1995, the 104th Congress tried to reform lobbying by passing the Lobbying Disclosure Act of 1995, which defines lobbying and requires lobbyists who are compensated for their activities to register with congressional officials. The legislation was later amended by the Lobbying Disclosure Technical Amendments Act of 1998.
There were subsequent modifications leading to the Honest Leadership and Open Government Act of 2007. The Lobbying Transparency and Accountability Act of 2006 (H.R. 4975) modified Senate rules, although some senators and a coalition of good-government groups assailed the bill as too weak.
The Honest Leadership and Open Government Act of 2007 was a comprehensive ethics and lobbying reform bill (H.R. 2316), which passed the House in 2007 by a large majority. A parallel Senate version of the legislation (S. 1) passed in 2007 by a nearly unanimous vote. After the House and Senate resolved their differences and passed an amended revision, President Bush signed the enrolled bill into law (Pub.L. 110-81).
Some states have considered permanently banning government employees from lobbying on issues they had worked on. For example, a proposal along these lines in Maryland would have prevented county employees from ever lobbying on issues they had worked on; it also required county officials to post financial disclosures and prohibited gifts from contractors.
Jack Abramoff, emerging from prison, has spoken publicly about lobbying. In his view, regulations designed to rein in the excesses of lobbying have not been effective, and reforms and regulations have not cleaned up the system "at all".
Abramoff said lobbyists could "find a way around just about any reform Congress enacted", and gave an example: "You can't take a congressman to lunch for $25 and buy him a hamburger or a steak or something like that ... But you can take him to a fund-raising lunch and not only buy him that steak, but give him $25,000 extra and call it a fund-raiser – and have all the same access and all the same interactions with that congressman."
— Jack Abramoff, commenting on 60 Minutes, according to CNN
A similar view suggests that lobbying reform efforts have been "fought tooth and nail", since the people with the power to reform would be curtailing their own power and income flows.
Foreign lobbying:
Since commerce worldwide is becoming more integrated, with firms headquartered in one country increasingly doing business in many other countries, it is logical to expect that lobbying efforts will reflect the increasing globalization. Sometimes foreign-owned corporations will want to lobby the United States government, and in such instances, new rules can apply, since it can be particularly thorny resolving whether national security interests are at stake and how they might be affected.
Serious concerns about lobbying firms representing foreign entities, and potentially values opposed to American principles, arose after Axis agitprop was planted on American soil in the years before World War II, partly through the German Dye Trust's retention of public-relations specialist Ivy Lee's firm. As a result, in 1938 Congress passed the Foreign Agents Registration Act (FARA), which required an explicit listing of all political activities undertaken by a lobbyist on behalf of any foreign principal and obliged foreign lobbyists to share information about their contracts with the Justice Department. FARA's mandate was to disclose to policymakers the sources of information that influenced public opinion, policy, and law.
However, the goal was not to restrict the speech of the lobbyist or the content of the lobbying. Nonetheless, it was estimated that less than half of foreign lobbyists who should have registered under FARA actually did so.
By the 1960s, perceived failures in FARA's enforcement led to public outcry against lobbying excesses, while revelations of foreign bribery circulated regularly well into the early 1970s. This prompted proposed legislation to reduce the autonomy of foreign firms, most of which was not enacted owing to constitutional concerns.
While the House of Representatives passed a rule to increase public scrutiny of foreign lobbying, one estimate was that about 75% of lobbyists were exempt from a registration requirement, including individuals representing foreign interests.
A general trend is that the number of lobbyists representing foreign companies is rising. The case of Washington's APCO Worldwide attracted negative publicity: in 1995 the firm represented the dictatorship of Nigeria's General Sani Abacha, whose regime had hanged nine pro-democracy activists.
While current law forbids foreign nations from contributing to federal, state, or local elections, loopholes allow American subsidiaries of foreign corporations to establish so-called separate segregated funds (SSFs) to raise money. According to one view, the definition of which firms count as "foreign" is unclear, and the lack of clarity undermines the ability to regulate their activity.
Foreign-funded lobbying efforts include those of the Israel, Saudi Arabia, Turkey, Egypt, Pakistan, Libya, and China lobbies. In 2010, foreign governments spent approximately $460 million on lobbying Congress and the U.S. government. Between 2015 and 2017, Saudi Arabia paid $18 million to 145 registered lobbyists to influence the U.S. government.
While Congress has tried to quell criticism of the leverage of domestic lobbying firms by updating domestic lobbying legislation, such as the 1997 revision of the Lobbying Disclosure Act, its inaction in rectifying loopholes in foreign-lobbying regulation has reportedly led to scandals. There was a report of an upsurge of lobbying by foreign-owned U.S. subsidiaries against Democratic efforts in early 2010 to limit campaign spending; the proposal would have restricted lobbying by U.S. subsidiaries of foreign firms. In 2011, the Chinese firm Alibaba hired a lobbying firm in Washington when it began contemplating a purchase of the U.S. firm Yahoo!. In another case, a lobbying effort described as "extraordinary" sought to change the designation of a fringe Iranian opposition group from terrorist organization to benign organization.
Lobbyists seeking to downgrade the designation hired influential former officials, including former CIA directors, a former FBI director, and others, to advocate for the change. Others, however, have been accused of illegally lobbying for foreign nations, or of failing to register as a foreign agent, and may face prison time as a result.
For more about Political Lobbying in the United States, click on any of the following blue hyperlinks:
- United States Chamber of Commerce
- History of lobbying in the United States
- Political action committee
- National Rifle Association
- Second Constitutional Convention of the United States
- AARP (formerly the American Association of Retired Persons)
- Honest Leadership and Open Government Act of 2007
- Israel lobby in the United States
- American Israel Public Affairs Committee
- Diaspora politics in the United States
- China Lobby
- Turkish lobby in the United States
- Libya lobby in the United States
- Saudi Arabia lobby in the United States
- Fossil fuels lobby
- Florida Institute of CPAs
- Albanian American Civic League
- American Automobile Association
- Center for Responsive Politics
- Arab lobby in the United States
- Jack Abramoff Indian lobbying scandal
- Jerry Lewis - Lowery lobbying firm controversy
- Lobbying Disclosure Act of 1995
- Money loop
- Mothers Against Drunk Driving
- United States v. Harriss
- NARFE (National Active and Retired Federal Employees)
- Lobbying Database from OpenSecrets.org
- FollowtheMoney.org
- Government Accountability Groups (from "500 Leading U.S. Progressive Organizations by Category")
- Sourcewatch
- Lobbyists.info database of lobbyists and government relations professionals
- Open Secrets glossary
- Lawrence Lessig TED talk on lobbying
- US Senate Lobbying-Database Search
- US House of Representatives-Lobby Contributions Search
- Report on AIPAC lobbying
- Lobbyists 4 Good
- Moral lobbying techniques for persuading state legislators
Citizens United v. Federal Election Commission, 558 U.S. 310 (2010), is a landmark United States Supreme Court case concerning campaign finance.
The Court held that the free speech clause of the First Amendment prohibits the government from restricting independent expenditures for political communications by corporations, including nonprofit corporations, labor unions, and other associations.
The case arose after Citizens United, a conservative non-profit organization, sought to air and advertise a film critical of Democratic presidential candidate Hillary Clinton shortly before the 2008 Democratic primary elections.
This violated the 2002 Bipartisan Campaign Reform Act, which prohibited any corporation or labor union from making an "electioneering communication" within 30 days of a primary or 60 days of an election, or making any expenditure advocating the election or defeat of a candidate at any time.
In a majority opinion joined by four other justices, Associate Justice Anthony Kennedy held that the Bipartisan Campaign Reform Act's prohibition of all independent expenditures by corporations and unions violated the First Amendment's protection of free speech.
The Court overruled Austin v. Michigan Chamber of Commerce (1990), which had allowed different restrictions on speech-related spending based on corporate identity, as well as a portion of McConnell v. FEC (2003) that had restricted corporate spending on electioneering communications.
The ruling effectively freed labor unions and corporations to spend money on electioneering communications and to directly advocate for the election or defeat of candidates. In his dissenting opinion, Associate Justice John Paul Stevens argued that the Court's ruling represented "a rejection of the common sense of the American people, who have recognized a need to prevent corporations from undermining self government."
The decision remains highly controversial, generating much public discussion and receiving strong support and opposition from various groups. Senator Mitch McConnell commended the decision, arguing that it represented "an important step in the direction of restoring the First Amendment rights".
By contrast, President Barack Obama stated that the decision "gives the special interests and their lobbyists even more power in Washington". The ruling had a major impact on campaign finance, allowing unlimited election spending by corporations and labor unions and fueling the rise of Super PACs. Later rulings by the Roberts Court, including McCutcheon v. FEC (2014), would strike down other campaign finance restrictions.
Click on any of the following blue hyperlinks for more about Citizens United v. Federal Election Commission Supreme Court Ruling:
- Case summary
- Background
- In the District Court
- Before the Supreme Court
- Decision
- Subsequent developments
- Further court rulings
- Legislative responses
- Political impact
- See also:
- 1996 United States campaign finance controversy
- 2009 term opinions of the Supreme Court of the United States
- Bank of the United States v. Deveaux (1809)
- James Bopp
- David Bossie
- Issue advocacy ads
- US corporate law
- Bowman v United Kingdom [1998] ECHR 4, (1998) 26 EHRR 1, case under the European Convention on Human Rights allowing restrictions on spending money to promote political equality
- Harper v. Canada (Attorney General) [2004] SCR 827, restrictions on spending in Canada
- Animal Defenders International v United Kingdom [2013] ECHR 362, the leading case in Europe, which held that the UK's total ban on political advertising was compatible with freedom of expression, "given the danger of unequal access based on wealth" to political advertising, which goes "to the heart of the democratic process."
- Electoral reform in the United States, which discusses efforts to overturn Citizens United among other electoral reform initiatives.
- McConnell v. FEC
- Buckley v. Valeo
- McCutcheon v. FEC
- FEC v. National Conservative PAC
- FEC v. Wisconsin Right to Life, Inc.
- FEC v. Massachusetts Citizens for Life
End Citizens United (ECU) is a political action committee in the United States.
The organization is working to reverse the U.S. Supreme Court's 2010 decision in Citizens United v. Federal Election Commission (see above), which deregulated limits on independent-expenditure group spending for (or against) specific candidates.
ECU is focused on driving large campaign donations out of politics, with a goal of electing "campaign-finance reform champions" to Congress by contributing and raising money for these candidates as well as running independent expenditures. End Citizens United was founded in 2015, operating in its first election cycle during 2016 with more than $25 million in funding.
The organization has endorsed Democratic candidates such as Zephyr Teachout, Hillary Clinton, Russ Feingold, Beto O'Rourke, Elizabeth Warren and Jon Ossoff.
ECU was one of the largest outside groups funding the campaigns of U.S. Senators Maggie Hassan and Catherine Cortez Masto during the 2016 election, spending a combined $4.4 million on the races. End Citizens United announced that it had raised more than $7.5 million from grassroots donations by mid-2017, and planned to raise $35 million for the 2018 election cycle.
In the spring of 2018, an anonymous U.S.-based contractor paid at least 3,800 micro job workers to manipulate what stories would come up when people searched for the PAC via Google.
Click on any of the following blue hyperlinks for more about "End Citizens United":
Think Tanks, including a List of Think Tanks
- YouTube Video: What is a think tank?
- YouTube Video: The Influence of 'Think Tanks' in US Policy
- YouTube Video: What Should Think Tanks Do? (WoodrowWilsonCenter)
A think tank or policy institute is a research institute which performs research and advocacy concerning topics such as social policy, political strategy, economics, military, technology, and culture.
Most policy institutes are non-profit organisations, which some countries such as the United States and Canada provide with tax exempt status. Other think tanks are funded by governments, advocacy groups, or corporations, and derive revenue from consulting or research work related to their projects.
The following article lists global policy institutes according to continental categories, and then sub-categories by country within those areas. These listings are not comprehensive, given that more than 7,500 think tanks exist worldwide.
Types of Think Tanks:
Think tanks vary by ideological perspective, source of funding, topical emphasis, and prospective consumers. Some think tanks, such as The Heritage Foundation, which promotes conservative principles, and the Center for American Progress, are more partisan in purpose. Others, such as the Tellus Institute, which emphasizes social and environmental topics, are more issue-oriented.
Funding sources and intended consumers also define the workings of think tanks. Some receive direct government assistance, while others rely on private individual or corporate donors. This invariably affects the degree of academic freedom within each policy institute and to whom or what the institution feels beholden.
Funding may also represent who or what the institution wants to influence; in the United States, for example, "Some donors want to influence votes in Congress or shape public opinion, others want to position themselves or the experts they fund for future government jobs, while others want to push specific areas of research or education."
A new trend, resulting from globalization, is collaboration between policy institutes in different countries. For instance, the Carnegie Endowment for International Peace operates offices in Washington, D.C., Beijing, Beirut, Brussels and Moscow.
The Think Tanks and Civil Societies Program (TTCSP) at the University of Pennsylvania, led by Dr. James McGann, annually rates policy institutes worldwide in a number of categories and presents its findings in the Global Go-To Think Tanks rating index.
However, this method of the study and assessment of policy institutes has been criticized by researchers such as Enrique Mendizabal and Goran Buldioski, Director of the Think Tank Fund, assisted by the Open Society Institute.
Several authors have indicated a number of different methods of describing policy institutes in a way that takes into account regional and national variations. For example:
- Independent civil society think tanks established as non-profit organisations—ideologically identifiable or not;
- Policy research institutes affiliated with a university;
- Government-created or state-sponsored think tanks;
- Corporate-created or business-affiliated think tanks;
- Political party think tanks and legacy or personal think tanks;
- Global (or regional) think tanks (with some of the above).
Alternatively, one could use some of the following criteria:
- Size and focus: e.g., large and diversified, large and specialized, small and specialized;
- Stage of development: e.g., first (small), second (small to large, with more complex projects), and third (larger, with policy influence) stages;
- Strategy, including funding sources (individuals, corporations, foundations, donors/governments, endowments, sales/events) and business model (independent research, contract work, advocacy);
- The balance between research, consultancy, and advocacy;
- The source of their arguments: ideology, values or interests; applied, empirical or synthesis research; or theoretical or academic research (Stephen Yeo);
- The manner in which the research agenda is developed: by senior members of the think tank, by individual researchers, or by the think tank's funders;
- Their influencing approaches and tactics (an interesting treatment comes from Abelson) and the time horizon of their strategies: long-term versus short-term mobilisation;
- The various audiences of the think tanks, as consumers and as the public (Zufeng provides a good framework for China);
- Affiliation, which refers to independence (or autonomy) but also includes think tanks with formal and informal links to political parties, interest groups and other political players.
Advocacy by think tanks:
In some cases, corporate interests and political groups have found it useful to create policy institutes, advocacy organizations, and think tanks.
For example, The Advancement of Sound Science Coalition was formed in the mid-1990s to dispute research finding an association between second-hand smoke and cancer.
According to an internal memorandum from Philip Morris Companies referring to the United States Environmental Protection Agency (EPA), "The credibility of the EPA is defeatable, but not on the basis of ETS [environmental tobacco smoke] alone,... It must be part of a larger mosaic that concentrates all the EPA's enemies against it at one time."
According to Fairness and Accuracy in Reporting, both left-wing and right-wing policy institutes are often quoted and rarely identified as such. The result is that think tank "experts" are sometimes depicted as neutral sources without any ideological predispositions when, in fact, they represent a particular perspective.
In the United States, think tank publications on education are subjected to expert review by the National Education Policy Center's "Think Twice" think tank review project.
A 2014 New York Times report asserted that foreign governments buy influence at many United States think tanks.
According to the article: "More than a dozen prominent Washington research groups have received tens of millions of dollars from foreign governments in recent years while pushing United States government officials to adopt policies that often reflect the donors’ priorities."
Global think tanks:
Article: List of think tanks
United States:
See also: List of think tanks in the United States
As the classification is most often used today, the oldest American think tank is the Carnegie Endowment for International Peace, founded in 1910. The Institute for Government Research, which later merged with two organizations to form the Brookings Institution, was formed in 1916.
Other early twentieth century organizations now classified as think tanks include:
The Great Depression and its aftermath spawned several economic policy organizations, such as the National Planning Association (1934), the Tax Foundation (1937), and the Committee for Economic Development (1943).
In collaboration with the Douglas Aircraft Company, the Air Force set up the RAND Corporation in 1946 to develop weapons technology and strategic defense analysis.
More recently, progressive and liberal think tanks have been established, most notably the Center for American Progress, which has close ties to former United States President Barack Obama and other prominent Democrats, and the Center for Research on Educational Access and Leadership (CREAL).
Think tanks help shape both foreign and domestic policy. They receive funding from private donors, and members of private organizations. By 2013, the largest 21 think tanks in the US spent more than $1 billion per year. Think tanks may feel more free to propose and debate controversial ideas than people within government.
The progressive media watchdog group Fairness and Accuracy in Reporting (FAIR) has identified the top 25 think tanks by media citations, noting that from 2006 to 2007 the number of citations declined 17%.
The FAIR report reveals the ideological breakdown of the citations: 37% conservative, 47% centrist, and 16% liberal. Their data show that the most-cited think tank was the Brookings Institution, followed by the Council on Foreign Relations, the American Enterprise Institute, The Heritage Foundation, and the Center for Strategic and International Studies.
Recently, in response to scrutiny of think tanks over apparent conflicts of interest and lack of transparency, Martin S. Indyk, executive vice president of the Brookings Institution (called the "most prestigious think tank in the world"), said that Brookings had "decided to prohibit corporations or corporate-backed foundations from making anonymous contributions."
In August 2016, the New York Times published a series on think tanks that blur the line between research and advocacy. One of the cases the journalists cited was Brookings, where scholars paid by a seemingly independent think tank "push donors' agendas amplifying a culture of corporate influence in Washington."
For example, in exchange for hundreds of thousands of dollars, the Brookings Institution provided the publicly traded Lennar Corporation, one of the United States' largest home builders, with a significant advantage in pursuing its US$8 billion revitalization project in Hunters Point, San Francisco.
In 2014, Lennar's then-regional vice president in charge of the San Francisco revitalization, Kofi Bonner, was named a Brookings senior fellow, a "trusted adviser" position that carries some distinction. Bruce Katz, a Brookings vice president, also offered to help Lennar "engage with national media to develop stories that highlight Lennar's innovative approach."
Government:
Government think tanks are also important in the United States, particularly in the security and defense field. These include the Center for Technology and National Security Policy at the National Defense University, the Center for Naval Warfare Studies at the Naval War College, and the Strategic Studies Institute at the U.S. Army War College.
The government funds, wholly or in part, activities at approximately 30 Federally Funded Research and Development Centers (FFRDCs). FFRDCs are unique independent nonprofit entities sponsored and funded by the United States government to meet specific long-term technical needs that cannot be met by any other single organization.
FFRDCs typically assist government agencies with scientific research and analysis, systems development, and systems acquisition. They bring together the expertise and outlook of government, industry, and academia to solve complex technical problems.
These FFRDCs include:
Similar to the above quasi-governmental organizations are Federal Advisory Committees.
These groups, sometimes referred to as commissions, are a form of think tank dedicated to advising the US President or the executive branch of government. They typically focus on a specific issue and, as such, might be considered similar to special interest groups. However, unlike special interest groups, these committees are subject to oversight regulation and are required to make formal records available to the public. Approximately 1,000 of these advisory committees are described in the FACA searchable database.
See also:
Most policy institutes are non-profit organisations, which some countries such as the United States and Canada provide with tax exempt status. Other think tanks are funded by governments, advocacy groups, or corporations, and derive revenue from consulting or research work related to their projects.
The following article lists global policy institutes according to continental categories, and then sub-categories by country within those areas. These listings are not comprehensive, given that more than 7,500 think tanks exist worldwide.
Types of Think Tanks:
Think tanks vary by ideological perspectives, sources of funding, topical emphasis and prospective consumers. Some think tanks, such as The Heritage Foundation, which promotes conservative principles, and the Center for American Progress are more partisan in purpose. Others, including the Tellus Institute, which emphasizes social and environmental topics, are more issue-oriented groups.
Funding sources and the consumers intended also define the workings of think tanks. Some receive direct government assistance, while others rely on private individual or corporate donors. This will invariably affect the degree of academic freedom within each policy institute and to whom or what the institution feels beholden.
Funding may also represent who or what the institution wants to influence; in the United States, for example, "Some donors want to influence votes in Congress or shape public opinion, others want to position themselves or the experts they fund for future government jobs, while others want to push specific areas of research or education."
A new trend, resulting from globalization, is collaboration between policy institutes in different countries. For instance, the Carnegie Endowment for International Peace operates offices in Washington, D.C., Beijing, Beirut, Brussels and Moscow.
The Think Tanks and Civil Societies Program (TTCSP) at the University of Pennsylvania, led by Dr. James McGann, annually rates policy institutes worldwide in a number of categories and presents its findings in the Global Go-To Think Tanks rating index.
However, this method of the study and assessment of policy institutes has been criticized by researchers such as Enrique Mendizabal and Goran Buldioski, Director of the Think Tank Fund, assisted by the Open Society Institute.
Several authors have indicated a number of different methods of describing policy institutes in a way that takes into account regional and national variations. For example:
- Independent civil society think tanks established as non-profit organisations—ideologically identifiable or not;
- Policy research institutes affiliated with a university;
- Government-created or state sponsored think tanks;
- Corporate created or business affiliated think tanks;
- Political party think tanks and legacy or personal think tanks;
- Global (or regional) think tanks (with some of the above).
Alternatively, one could use some of the following criteria:
- Size and focus: e.g., large and diversified, large and specialized, small and specialized;
- Stage of development: e.g., first stage (small), second stage (small to large, with more complex projects), and third stage (larger, with policy influence);
- Strategy, including: Funding sources (individuals, corporations, foundations, donors/governments, endowments, sales/events) and business model (independent research, contract work, advocacy);
- The balance between research, consultancy, and advocacy;
- The source of their arguments: Ideology, values or interests; applied, empirical or synthesis research; or theoretical or academic research (Stephen Yeo);
- The manner in which the research agenda is developed—by senior members of the think tank, by individual researchers, or by the think tank's funders;
- Their influencing approaches and tactics (many researchers have addressed these; Abelson offers a notable treatment) and the time horizon of their strategies: long-term versus short-term mobilisation;
- Their various audiences, both as consumers and as the wider public (again, many authors; Zufeng provides a good framework for China);
- Affiliation, which refers to the issue of independence (or autonomy) but also includes think tanks with formal and informal links to political parties, interest groups and other political players.
Advocacy by think tanks:
In some cases, corporate interests and political groups have found it useful to create policy institutes, advocacy organizations, and think tanks.
For example, The Advancement of Sound Science Coalition was formed in the mid-1990s to dispute research finding an association between second-hand smoke and cancer.
According to an internal memorandum from Philip Morris Companies referring to the United States Environmental Protection Agency (EPA), "The credibility of the EPA is defeatable, but not on the basis of ETS [environmental tobacco smoke] alone,... It must be part of a larger mosaic that concentrates all the EPA's enemies against it at one time."
According to Fairness and Accuracy in Reporting (FAIR), both left-wing and right-wing policy institutes are often quoted and rarely identified as such. The result is that think tank "experts" are sometimes depicted as neutral sources without any ideological predispositions when, in fact, they represent a particular perspective.
In the United States, think tank publications on education are subjected to expert review by the National Education Policy Center's "Think Twice" think tank review project.
A 2014 New York Times report asserted that foreign governments buy influence at many United States think tanks.
According to the article: "More than a dozen prominent Washington research groups have received tens of millions of dollars from foreign governments in recent years while pushing United States government officials to adopt policies that often reflect the donors’ priorities."
Global think tanks:
Article: List of think tanks
United States:
See also: List of think tanks in the United States
As the classification is most often used today, the oldest American think tank is the Carnegie Endowment for International Peace, founded in 1910. The Institute for Government Research, which later merged with two organizations to form the Brookings Institution, was formed in 1916.
Other early twentieth century organizations now classified as think tanks include:
- the Hoover Institution (1919),
- The Twentieth Century Fund (1919, and now known as the Century Foundation),
- the National Bureau of Economic Research (1920),
- the Council on Foreign Relations (1921), and the Social Science Research Council (1923).
The Great Depression and its aftermath spawned several economic policy organizations, such as the National Planning Association (1934), the Tax Foundation (1937), and the Committee for Economic Development (1943).
In collaboration with the Douglas Aircraft Company, the Air Force set up the RAND Corporation in 1946 to develop weapons technology and strategic defense analysis.
More recently, progressive and liberal think tanks have been established, most notably the Center for American Progress and the Center for Research on Educational Access and Leadership (CREAL). The Center for American Progress has close ties to former United States President Barack Obama and other prominent Democrats.
Think tanks help shape both foreign and domestic policy. They receive funding from private donors and members of private organizations. By 2013, the largest 21 think tanks in the US spent more than $1 billion per year. Think tanks may feel more free to propose and debate controversial ideas than people within government.
The progressive media watchdog group Fairness and Accuracy in Reporting (FAIR) has identified the top 25 think tanks by media citations, noting that from 2006 to 2007 the number of citations declined 17%.
The FAIR report reveals the ideological breakdown of the citations: 37% conservative, 47% centrist, and 16% liberal. Their data show that the most-cited think tank was the Brookings Institution, followed by the Council on Foreign Relations, the American Enterprise Institute, The Heritage Foundation, and the Center for Strategic and International Studies.
Recently, in response to scrutiny about think tanks appearing to have conflicts of interest or lacking transparency, Martin S. Indyk, executive vice president of the Brookings Institution – the "most prestigious think tank in the world" – admitted that Brookings had "decided to prohibit corporations or corporate-backed foundations from making anonymous contributions."
In August 2016, The New York Times published a series on think tanks that blur the line between research and lobbying. One of the cases the journalists cited was Brookings, where scholars paid by a seemingly independent think tank "push donors' agendas amplifying a culture of corporate influence in Washington."
For example, in exchange for hundreds of thousands of dollars, the Brookings Institution provided the publicly traded company Lennar Corporation – one of the United States' largest home builders – with a significant advantage in pursuing its US$8 billion revitalization project in Hunters Point, San Francisco.
In 2014, Lennar's then-regional vice president in charge of the San Francisco revitalization, Kofi Bonner, was named a Brookings senior fellow – a position as "trusted adviser" that carries some distinction. Bruce Katz, a Brookings vice president, also offered to help Lennar Corporation "engage with national media to develop stories that highlight Lennar's innovative approach."
Government:
Government think tanks are also important in the United States, particularly in the security and defense field. These include the Center for Technology and National Security Policy at the National Defense University, the Center for Naval Warfare Studies at the Naval War College, and the Strategic Studies Institute at the U.S. Army War College.
The government funds, wholly or in part, activities at approximately 30 Federally Funded Research and Development Centers (FFRDCs). FFRDCs are unique independent nonprofit entities sponsored and funded by the United States government to meet specific long-term technical needs that cannot be met by any other single organization.
FFRDCs typically assist government agencies with scientific research and analysis, systems development, and systems acquisition. They bring together the expertise and outlook of government, industry, and academia to solve complex technical problems.
These FFRDCs include:
- the RAND Corporation,
- the MITRE Corporation,
- the Institute for Defense Analyses,
- the Aerospace Corporation,
- the MIT Lincoln Laboratory,
- and other organizations supporting various departments within the United States Government.
Similar to the above quasi-governmental organizations are Federal Advisory Committees.
These groups, sometimes referred to as commissions, are a form of think tank dedicated to advising the US President or the executive branch of government. They typically focus on a specific issue and, as such, might be considered similar to special interest groups.
However, unlike special interest groups, these committees have come under some oversight regulation and are required to make formal records available to the public. Approximately 1,000 of these advisory committees are described in the FACA searchable database.
See also:
- Collective intelligence
- Futurists
- Internet think tanks
- List of think tanks
- Lobbying
- Mass collaboration
- Mass communication
- Overton window
- Strategic studies
- TED (conference)
- Think Tanks and Civil Societies Program (TTCSP)
- The Economist Magazine and NPR's Marketplace report: "Under the Influence: Think Tanks and The Money That Fuels Them"
- Foreign Policy Research Institute, Think Tanks and Civil Societies Program directory of over 5000 think tanks and research on the role and impact of think tanks.
- PBS: Think Tank with Ben Wattenberg: "Thinking About Think Tanks" – interview with Christopher DeMuth, President of AEI, 13 October 2005
- Enrique Mendizabal (former head of programme at the Overseas Development Institute) on the definition of think tanks: towards a more useful discussion – a new way of studying think tanks that focuses on their functions rather than form.
- 2008–2016 Global Think Tank Rating
Southern Poverty Law Center (SPLC) including List of organizations designated by SPLC as hate groups
- YouTube Video: SPLC—The Year in Hate and Extremism
- YouTube Video: A black man goes undercover in the alt-right | Theo E.J. Wilson
- YouTube Video: Morris Dees - Southern Poverty Law Center | American Freedom Stories | Biography
The Southern Poverty Law Center (SPLC) is an American nonprofit legal advocacy organization specializing in civil rights and public interest litigation.
Based in Montgomery, Alabama, it is known for its legal cases against white supremacist groups, its classification of hate groups and other extremist organizations, and for promoting tolerance education programs.
The SPLC was founded by Morris Dees, Joseph J. Levin Jr., and Julian Bond in 1971 as a civil rights law firm in Montgomery, Alabama. Bond served as president of the board between 1971 and 1979.
In 1979, the SPLC began a litigation strategy of filing civil suits for monetary damages on behalf of the victims of violence from the Ku Klux Klan and other white supremacist groups, with all damages recovered given to the victims or donated to other organizations.
The SPLC also became involved in other civil rights causes, including cases to challenge what it sees as institutional racial segregation and discrimination, inhumane and unconstitutional conditions in prisons and detention centers, discrimination based on sexual orientation, mistreatment of illegal immigrants, and the unconstitutional mixing of church and state.
The SPLC has provided information about hate groups to the Federal Bureau of Investigation (FBI) and other law enforcement agencies.
Since the 2000s, the SPLC's classification and listings of hate groups (organizations it assesses to "attack or malign an entire class of people, typically for their immutable characteristics") and extremists have often been described as authoritative and are widely accepted and cited in academic and media coverage of such groups and related issues.
The SPLC's listings have also been the subject of criticism from others, who argue that some of the SPLC's listings are overbroad, politically motivated, or unwarranted. There have also been accusations of misuse or unnecessarily extravagant use of funds by the organization, leading some employees to call the headquarters "Poverty Palace".
In 2019, founder Morris Dees was fired, which was followed by the resignation of president Richard Cohen. An outside consultant, Tina Tchen, was brought in to review workplace practices, particularly relating to accusations of racial and sexual harassment.
Click on any of the following blue hyperlinks for more about the Southern Poverty Law Center:
- History
- Notable cases
- Alabama legislature
- Vietnamese fishermen
- White Patriot Party
- United Klans of America
- White Aryan Resistance
- Church of the Creator
- Christian Knights of the KKK
- Aryan Nations
- Ten Commandments monument
- Ranch Rescue
- Billy Ray Johnson
- Imperial Klans of America
- Mississippi correctional institutions
- Polk County Florida Sheriff
- Andrew Anglin and The Daily Stormer
- Projects
- Tracking of hate groups and extremists
- Controversies regarding hate group and extremist designations
- Finances
- See also:
- Media related to Southern Poverty Law Center at Wikimedia Commons
- Official website
List of organizations designated by the Southern Poverty Law Center as hate groups:
The following is a list of U.S.-based organizations that are classified as hate groups by the Southern Poverty Law Center (SPLC). The SPLC is an American nonprofit legal advocacy organization specializing in civil rights and public interest litigation.
The SPLC defines a hate group as "an organization that — based on its official statements or principles, the statements of its leaders, or its activities — has beliefs or practices that attack or malign an entire class of people, typically for their immutable characteristic." The SPLC states that "Hate group activities can include criminal acts, marches, rallies, speeches, meetings, leafleting or publishing" and adds that inclusion on its hate-group list "does not imply that a group advocates or engages in violence or other criminal activity."
Since 1981, the SPLC's Intelligence Project has published a quarterly Intelligence Report, which monitors hate groups and extremist organizations in the United States. The SPLC began an annual census of hate groups in 1990. The SPLC listed 1,020 hate groups and hate-group chapters on its 2018 list—an all-time high fueled primarily by an increase in radical right groups.
The Intelligence Report provides information regarding the organizational efforts and tactics of these groups, and it is cited by a number of scholars as a reliable and comprehensive source on U.S. hate groups.
The SPLC also publishes the HateWatch Weekly newsletter, which documents racism and extremism, and the Hatewatch blog.
Pundits, politicians, and some of the designated groups have objected to the SPLC's list. For example, the Family Research Council disputed its designation in 2010, and the Center for Immigration Studies disputed the SPLC anti-immigrant designation in 2016. The SPLC's hate group listings have also been criticized by some political observers and prominent Republicans.
Historical trends:
In 1999, the SPLC listed 457 hate groups; that number steadily increased until 2011, when 1,018 groups were listed.
The rise from 2008 onward was attributed in part to anger at Barack Obama, the first black president of the United States. Thereafter, the number of hate groups steadily dropped, reaching a low of 784 in 2014 (a 23% drop).
However, between 2014 and 2018, the number of hate groups rose by 30%, reaching 892 in 2015; 917 in 2016; 954 in 2017; and 1,020 in 2018.
According to Mark Potok at the SPLC, Donald Trump's "demonizing statements about Latinos and Muslims" during his presidential campaign "have electrified the radical right, leading to glowing endorsements from white nationalist leaders such as Jared Taylor and former Klansman David Duke".
The relative strength of hate groups has varied over time; for example, the Ku Klux Klan has markedly declined, while other white supremacist groups have substantially strengthened.
In its 2019 annual report (covering the year 2018), the SPLC listed 1,020 organizations as active hate groups, categorized by type, as follows:
- Ku Klux Klan (51),
- neo-Nazi (112),
- white nationalist (148),
- racist skinhead (63),
- Christian Identity (17),
- neo-Confederate (36),
- black nationalist (264),
- anti-immigrant (17),
- anti-LGBT (49),
- anti-Muslim (100),
- and "other hate" (163, consisting of:
- 15 hate music groups,
- 8 Holocaust denial groups,
- 2 male supremacy groups,
- 30 neo-Völkisch groups,
- 11 radical traditional Catholic groups,
- and 97 other groups).
Click on any of the following blue hyperlinks for more about the List of organizations designated by the SPLC as Hate Groups:
- Groups by type
- See also:
What does civics education look like in America? (Brookings Institution) as well as Civics, including Civic Education in the United States
- YouTube Video: The Importance of Civic Education - John B. King
- YouTube Video: Young people, the Internet and civic participation | Shakuntala Banaji | TEDxUHasselt
- YouTube Video: The social contract | Foundations of American democracy | US government and civics | Khan Academy
What does civics education look like in America?
by Elizabeth Mann Levesque, Nonresident Fellow - Governance Studies, Brown Center on Education Policy (7/23/2018)
"How well are schools preparing students to be effective citizens, voters, and members of their communities? This question seems more relevant than ever in the current era of contentious and polarized politics. Students recently earned national attention by organizing the March for Our Lives, a student-led demonstration against gun violence with marches occurring worldwide.
This surge of political activism by young people demonstrates a high capacity for political engagement among students. Yet at the same time, real concerns persist about the extent to which schools are equipping all students with the skills they need to be effective citizens, and whether some students will leave school more prepared than others.
In this context, the 2018 Brown Center Report on American Education focuses on the state of civics education in the U.S. Chapter 2 examines how states have incorporated certain practices into their requirements for civics education and uses survey data to assess whether student experiences reflect these practices. The data highlight how critical parts of a civics education, namely participatory elements and community engagement, are often missing from state requirements, whereas discussion and knowledge-building components appear more common.
WHAT CONSTITUTES A HIGH-QUALITY CIVICS EDUCATION?
As with almost any attempt to identify a set of “best” practices in education, we find different perspectives from different experts, with a research base too thin to offer unambiguous guidance. In this context, we turn to what appears to be as close as we could reasonably expect to a consensus view from experts—the Six Proven Practices (PP) for Effective Civic Learning framework.
Motivating this framework is a notion that teaching students facts about U.S. government is a goal, but not the exclusive goal, of civics education. The aim of civics education is broader and includes providing students with an understanding of how democratic processes work, as well as how to engage in these processes.
A high-quality civics education thus includes opportunities for students to engage in activities within the classroom that model what democratic processes look like, as well as opportunities to participate in the civic life of their communities and learn from this participation as a formal part of their coursework.
Reflecting this concept of what constitutes an effective civics education, the Proven Practices framework recommends that civics instruction include a set of practices that, together, provide students with the civic knowledge, skills, and dispositions that will equip them to participate in American democracy.
The PPs include six original practices along with four recently proposed additions.
TAKING AN INVENTORY OF STATE STANDARDS WITH RESPECT TO CIVICS EDUCATION:
To what extent are states incorporating these practices into their civics standards and curricula?
Because state policies dictate the knowledge and skills schools are required to teach their students, we created a 50-state inventory that examines whether states have adopted a subset of the PPs in their high school course graduation requirements, state standards, and curricula.
We do not expect each of the 10 PPs to show up in these documents. For example, it is unlikely that states discuss extracurricular activities in their standards. Therefore, we focus on the practices most likely to be mentioned in graduation requirements, standards, and curricula.
Some states might have adopted rigorous civics standards in ways that escaped our view because their language does not align with the language of the PPs. This could be true, for example, of states that have used the College, Career, and Civic Life (C3) Framework for Social Studies State Standards in designing their standards or curricula.
Similar to the PPs, the C3 framework emphasizes both knowledge acquisition and active participation in civic life. Therefore, in our state inventory we also note whether or not a state has adopted the C3 framework.
Of the PPs we examined, we found that the most common practices are classroom instruction, knowledge building, and discussion-based activities. These are far more common than participatory elements of learning or community engagement. For example, every state mentions discussion of current events in its standards or curriculum frameworks, and 42 states and Washington, D.C., require at least one course related to civics education. In contrast, just over half of states (26, plus Washington, D.C.) mention simulations of democratic processes or procedures, while only 11 states include service learning (a less strict definition of service learning brings this total to 20).
The lack of participatory elements of learning in state accountability frameworks highlights a void in civics education, as experts indicate that a high-quality civics education is incomplete without teaching students what civic participation looks like in practice, and how citizens can engage in their communities.
THE STUDENT EXPERIENCE:
In addition to this analysis of state policy, we explore an important aspect of civics education: the student experience. Using data from the nationally representative 2010 National Assessment of Educational Progress (NAEP) student survey on civics education, we look at the types of activities students report engaging in through their civics coursework.
Similar to the policy inventory, students’ self-reported experiences reflect an emphasis on in-class, discussion-based civics education. Figure 1 illustrates that discussion of current events occurs regularly, whereas opportunities for community engagement and participation in simulations of democratic procedures occur considerably less frequently.
LOOKING FORWARD: IMPROVING CIVICS EDUCATION FOR U.S. STUDENTS
Our analysis of state policy and self-reported student experiences indicates that most states do include important aspects of a quality civics education in their standards and curricula. However, there is room to grow in incorporating more participatory components of civics education into students’ experiences.
Fortunately, a growing body of research on civics education and an increasing number of resources, such as the C3 framework, are available to help states and educators provide their students with a well-rounded civics experience.
For more in-depth analysis of civics education, and what the education community can do to ensure that today’s students are prepared to be tomorrow’s citizens, see the 2018 Brown Center Report on American Education.
[End of Brookings Article]
___________________________________________________________________________
Below: Wikipedia articles for "Civics, and "Civics Education in the United States"
Civics derives from the French word civique, meaning citizen, and the Latin civica, a garland of oak leaves worn about the head as a crown, given in reward to those who saved another citizen from death. Civics relates to behavior affecting other citizens, particularly in the context of urban development.
Civic education is the study of the theoretical, political and practical aspects of citizenship, as well as its rights and duties. It includes the study of civil law and civil code, and the study of government with attention to the role of citizens―as opposed to external factors―in the operation and oversight of government.
Civic education in the United States:
Rationale for civic education in the United States:
The promotion of a republic and its values has been an important concern for policy-makers – to impact people's political perceptions, to encourage political participation, and to foster the principles enshrined in the Constitution (e.g. liberty, freedom of speech, civil rights).
The subject of “Civics” has been integrated into the Curriculum and Content Standards, to enhance the comprehension of democratic values in the educational system. Civic literature has found that “engaging young children in civic activities from an early age is a positive predictor of their participation in later civic life”.
As an academic subject, Civics has the instructional objective to promote knowledge that is aligned with self-governance and participation in matters of public concern. These objectives advocate for an instruction that encourages active student participation in democratic decision-making environments, such as voting to elect a course representative for a school government, or deciding on actions that will affect the school environment or community.
Thus the intersection of individual and collective decision-making activities is critical to shaping "individual's moral development". To reach those goals, civic instructors must promote the adoption of certain skills and attitudes such as "respectful argumentation, debate, information literacy", to support "the development of morally responsible individuals who will shape a morally responsible and civically minded society".
In the 21st century, young people are less interested in direct political participation (i.e. being in a political party or even voting), but are motivated to use digital media (e.g. Twitter, Facebook). Digital media enable young people to share and exchange ideas rapidly, enabling the coordination of local communities that promote volunteerism and political activism, in topics principally related to human rights and environmental subjects.
Young people are constructing and supporting their political identities in the 21st century by using social media, and digital tools (e.g. text messaging, hashtags, videos) to share, post, reply an opinion or attitude about a political/social topic and to promote social mobilization and support through online mechanism to a wide and diverse audience.
Therefore, civics' end-goal in the 21st century must be oriented to “empower the learners to find issues in their immediate communities that seem important to the people with whom they live and associate”, once “learners have identified with a personal issue and participated in constructing a collective framing for common issues”.
Current state of civic education in the United States:
According to the No Child Left Behind Act of 2001, one of the purposes of Civic Education is to “foster civic competence and responsibility” which is promoted through the Center for Civic Education’s We the People and Project Citizen initiatives.
However, there is a lack of consensus for how this mission should be pursued. The Center for Information & Research on Civic Learning & Engagement (CIRCLE) reviewed state civic education requirements in the United States for 2012. The findings include:
The lack of state-mandated student accountability relating to civics may be a result of a shift in emphasis towards reading and mathematics in response to the 2001 No Child Left Behind Act. There is a movement to require that states utilize the citizenship test as a graduation requirement, but this is seen as a controversial solution to some educators.
Students are also demonstrating that their civic knowledge leaves much to be desired. A National Center for Education Statistics NAEP report card for civics (2010) stated that “levels of civic knowledge in U.S. have remained unchanged or even declined over the past century”. Specifically, only 24 percent of 4th, 8th, and 12th graders were at or above the proficient level on the National Assessment of Educational Progress in civics.
Traditionally, civic education has emphasized the facts of government processes detached from participatory experience. In an effort to combat the existing approach, the National Council for the Social Studies developed the College, Career, and Civic Life (C3)
Framework for Social Studies State Standards. The C3 Framework emphasizes “new and active approaches” including the “discussion of controversial issues and current events, deliberation of public issues, service-learning, action civics, participation in simulation and role play, and the use of digital technologies”.
Civic education in the United States in the 21st century:
According to a study conducted by the Pew Research Center, among teens 12–17 years old, 95% have access to the Internet, 70% go online daily, 80% use social networking sites, and 77% have cell phones.
As a result, participatory culture has become a staple for today’s youth, affecting their conceptualization of civic participation. They use Web 2.0 tools (i.e. blogs, podcasts, wikis, social media) to: circulate information (blogs and podcasts); collaborate with peers (wikis); produce and exchange media; and connect with people around the world via social media and online communities.
The pervasiveness of participatory digital tools has led to a shift in the way adolescents today perceive civic action and participation. Whereas 20th century civic education embraced the belief of “dutiful citizenship” and civic engagement as a “matter of duty or obligation;” 21st century civic education has shifted to reflect youths' “personally expressive politics” and “peer-to-peer relationships” that promote civic engagement.
This shift in students' perceptions has led to classroom civic education experiences that reflect the digital world in which 21st century youth now live, in order to make the content both relevant and meaningful.
Civics education classrooms in the 21st century now seek to provide genuine opportunities to actively engage in the consumption, circulation, discussion, and production of civic and political content via Web 2.0 technologies such as blogging, wikis, and social media.
Although these tools offer new ways for engagement, interaction, and dialogue, educators have also recognized the need to teach youth how to interact both respectfully and productively with their peers and members of online communities.
As a result, many school districts have also begun adopting Media Literacy Frameworks for Engaged Citizenship as a pedagogical approach to prepare students for active participatory citizenship in today’s digital age. This model includes critical analysis of digital media as well as a deep understanding of media literacy as a “collaborative and participatory movement that aims to empower individuals to have a voice and to use it.
Criticism of civic education:
Sudbury schools contend that values, social justice and democracy must be learned through experience as Aristotle said: "For the things we have to learn before we can do them, we learn by doing them." They adduce that for this purpose schools must encourage ethical behavior and personal responsibility.
In order to achieve these goals schools must allow students the three great freedoms—freedom of choice, freedom of action and freedom to bear the results of action—that constitute personal responsibility. The "strongest, political rationale" for democratic schools is that they teach "the virtues of democratic deliberation for the sake of future citizenship."
This type of education is often alluded to in the deliberative democracy literature as fulfilling the necessary and fundamental social and institutional changes necessary to develop a democracy that involves intensive participation in group decision making, negotiation, and social life of consequence.
The following article is by Elizabeth Mann Levesque, Nonresident Fellow - Governance Studies, Brown Center on Education Policy, Brookings Institution (7/23/2018):
"How well are schools preparing students to be effective citizens, voters, and members of their communities? This question seems more relevant than ever in the current era of contentious and polarized politics. Students recently earned national attention by organizing the March for Our Lives, a student-led demonstration against gun violence with marches occurring worldwide.
This surge of political activism by young people demonstrates a high capacity for political engagement among students. Yet at the same time, real concerns persist about the extent to which schools are equipping all students with the skills they need to be effective citizens, and whether some students will leave school more prepared than others.
In this context, the 2018 Brown Center Report on American Education focuses on the state of civics education in the U.S. Chapter 2 examines how states have incorporated certain practices into their requirements for civics education and uses survey data to assess whether student experiences reflect these practices. The data highlight how critical parts of a civics education, namely participatory elements and community engagement, are often missing from state requirements, whereas discussion and knowledge-building components appear more common.
WHAT CONSTITUTES A HIGH-QUALITY CIVICS EDUCATION?
As with almost any attempt to identify a set of “best” practices in education, we find different perspectives from different experts, with a research base too thin to offer unambiguous guidance. In this context, we turn to what appears to be as close as we could reasonably expect to a consensus view from experts—the Six Proven Practices (PP) for Effective Civic Learning framework.
Motivating this framework is a notion that teaching students facts about U.S. government is a goal, but not the exclusive goal, of civics education. The aim of civics education is broader and includes providing students with an understanding of how democratic processes work, as well as how to engage in these processes.
A high-quality civics education thus includes opportunities for students to engage in activities within the classroom that model what democratic processes look like, as well as opportunities to participate in the civic life of their communities and learn from this participation as a formal part of their coursework.
Reflecting this concept of what constitutes an effective civics education, the Proven Practices framework recommends that civics instruction include a set of practices that, together, provide students with the civic knowledge, skills, and dispositions that will equip them to participate in American democracy.
The PPs include six original practices (numbered 1-6 below) along with four recently proposed additions (numbered 7-10 below):
1. Classroom instruction in civics, government, history, law, economics, and geography
2. Discussion of current events
3. Service learning
4. Extracurricular activities
5. Student participation in school governance
6. Simulations of democratic processes and procedures
7. News media literacy
8. Action civics
9. Social-emotional learning (SEL)
10. School climate reform
TAKING AN INVENTORY OF STATE STANDARDS WITH RESPECT TO CIVICS EDUCATION:
To what extent are states incorporating these practices into their civics standards and curricula?
Because state policies dictate the knowledge and skills schools are required to teach their students, we created a 50-state inventory that examines whether states have adopted a subset of the PPs in their high school course graduation requirements, state standards, and curricula.
We do not expect each of the 10 PPs to show up in these documents. For example, it is unlikely that states discuss extracurricular activities in their standards. Therefore, we focus on the practices most likely to be mentioned in graduation requirements, standards, and curricula:
- classroom instruction (PP 1),
- discussion of current events (PP 2),
- service learning (PP 3),
- simulations of democratic processes and procedures (PP 6),
- and news media literacy (PP 7).
Some states might have adopted rigorous civics standards in ways that escaped our view because their language does not align with the language of the PPs. This could be true, for example, of states that have used the College, Career, and Civic Life (C3) Framework for Social Studies State Standards in designing their standards or curricula.
Similar to the PPs, the C3 framework emphasizes both knowledge acquisition and active participation in civic life. Therefore, in our state inventory we also note whether or not a state has adopted the C3 framework.
Of the PPs we examined, we found that the most common practices are classroom instruction, knowledge building, and discussion-based activities. These are far more common than participatory elements of learning or community engagement. For example, every state mentions discussion of current events in its standards or curriculum frameworks, and 42 states and Washington, D.C., require at least one course related to civics education. In contrast, just over half of states (26, plus Washington, D.C.) mention simulations of democratic processes or procedures, while only 11 states include service learning (a less strict definition of service learning brings this total to 20).
The lack of participatory elements of learning in state accountability frameworks highlights a void in civics education, as experts indicate that a high-quality civics education is incomplete without teaching students what civic participation looks like in practice, and how citizens can engage in their communities.
THE STUDENT EXPERIENCE:
In addition to this analysis of state policy, we explore an important aspect of civics education: the student experience. Using data from the nationally representative 2010 National Assessment of Educational Progress (NAEP) student survey on civics education, we look at the types of activities students report engaging in through their civics coursework.
Similar to the policy inventory, students’ self-reported experiences reflect an emphasis on in-class, discussion-based civics education. Figure 1 illustrates that discussion of current events occurs regularly, whereas opportunities for community engagement and participation in simulations of democratic procedures occur considerably less frequently.
LOOKING FORWARD: IMPROVING CIVICS EDUCATION FOR U.S. STUDENTS
Our analysis of state policy and self-reported student experiences indicates that most states do include important aspects of a quality civics education in their standards and curricula. However, there is room to grow in incorporating more participatory components of civics education into students’ experiences.
Fortunately, a growing body of research on civics education and an increasing number of resources, such as the C3 framework, are available to help states and educators provide their students with a well-rounded civics experience.
For more in-depth analysis of civics education, and what the education community can do to ensure that today’s students are prepared to be tomorrow’s citizens, see the 2018 Brown Center Report on American Education.
[End of Brookings Article]
___________________________________________________________________________
Below: Wikipedia articles for "Civics" and "Civic Education in the United States"
Civics derives from the French word civique, meaning citizen, and the Latin civica, a garland of oak leaves worn about the head as a crown, given as a reward to those who saved another citizen from death. Civics relates to behavior affecting other citizens, particularly in the context of urban development.
Civic education is the study of the theoretical, political and practical aspects of citizenship, as well as its rights and duties. It includes the study of civil law and civil code, and the study of government with attention to the role of citizens―as opposed to external factors―in the operation and oversight of government.
Civic education in the United States:
Rationale for civic education in the United States:
The promotion of a republic and its values has been an important concern for policy-makers: to shape people's political perceptions, to encourage political participation, and to foster the principles enshrined in the Constitution (e.g. liberty, freedom of speech, civil rights).
The subject of “Civics” has been integrated into state curriculum and content standards to enhance the comprehension of democratic values in the educational system. The civic-education literature has found that “engaging young children in civic activities from an early age is a positive predictor of their participation in later civic life”.
As an academic subject, Civics has the instructional objective of promoting knowledge aligned with self-governance and participation in matters of public concern. These objectives call for instruction that encourages active student participation in democratic decision-making, such as voting to elect a class representative to student government or deciding on actions that will affect the school environment or community.
The intersection of individual and collective decision-making activities is thus critical to shaping an “individual's moral development”. To reach those goals, civics instructors must promote the adoption of skills and attitudes such as “respectful argumentation, debate, information literacy”, to support “the development of morally responsible individuals who will shape a morally responsible and civically minded society”.
In the 21st century, young people are less interested in direct political participation (e.g. joining a political party or even voting), but are motivated to use digital media (e.g. Twitter, Facebook). Digital media enable young people to share and exchange ideas rapidly and to coordinate local communities that promote volunteerism and political activism, principally on topics related to human rights and the environment.
Young people are constructing and supporting their political identities by using social media and digital tools (e.g. text messaging, hashtags, videos) to share, post, or reply with an opinion or attitude about a political or social topic, and to promote social mobilization and support through online mechanisms reaching a wide and diverse audience.
Therefore, the end goal of civics in the 21st century must be to “empower the learners to find issues in their immediate communities that seem important to the people with whom they live and associate”, so that “learners have identified with a personal issue and participated in constructing a collective framing for common issues”.
Current state of civic education in the United States:
According to the No Child Left Behind Act of 2001, one of the purposes of civic education is to “foster civic competence and responsibility”, which is promoted through the Center for Civic Education's We the People and Project Citizen initiatives.
However, there is a lack of consensus on how this mission should be pursued. The Center for Information & Research on Civic Learning & Engagement (CIRCLE) reviewed state civic education requirements in the United States in 2012. The findings include:
- All 50 states have social studies standards which include civics and government.
- 40 states require at least one course in government/civics.
- 21 states require a state-mandated social studies test, a decrease from 34 states in 2001.
- 8 states require students to take a state-mandated government/civics test.
- 9 states require a social studies test as a requirement for high school graduation.
The lack of state-mandated student accountability in civics may be a result of a shift in emphasis towards reading and mathematics in response to the 2001 No Child Left Behind Act. There is a movement to require states to use the citizenship test as a graduation requirement, but some educators see this as a controversial solution.
Students' civic knowledge also leaves much to be desired. The National Center for Education Statistics' 2010 NAEP report card for civics stated that “levels of civic knowledge in U.S. have remained unchanged or even declined over the past century”. Specifically, only 24 percent of 4th, 8th, and 12th graders were at or above the proficient level on the National Assessment of Educational Progress in civics.
Traditionally, civic education has emphasized the facts of government processes detached from participatory experience. To counter this approach, the National Council for the Social Studies developed the College, Career, and Civic Life (C3) Framework for Social Studies State Standards. The C3 Framework emphasizes “new and active approaches”, including the “discussion of controversial issues and current events, deliberation of public issues, service-learning, action civics, participation in simulation and role play, and the use of digital technologies”.
Civic education in the United States in the 21st century:
According to a study conducted by the Pew Research Center, among teens 12–17 years old, 95% have access to the Internet, 70% go online daily, 80% use social networking sites, and 77% have cell phones.
As a result, participatory culture has become a staple for today's youth, affecting their conceptualization of civic participation. They use Web 2.0 tools (e.g. blogs, podcasts, wikis, social media) to circulate information (blogs and podcasts), collaborate with peers (wikis), produce and exchange media, and connect with people around the world via social media and online communities.
The pervasiveness of participatory digital tools has shifted the way adolescents perceive civic action and participation. Whereas 20th-century civic education embraced “dutiful citizenship” and treated civic engagement as a “matter of duty or obligation”, 21st-century civic education has shifted to reflect youths' “personally expressive politics” and the “peer-to-peer relationships” that promote civic engagement.
This shift in students' perceptions has led to classroom civic education experiences that reflect the digital world in which 21st century youth now live, in order to make the content both relevant and meaningful.
Civics education classrooms in the 21st century now seek to provide genuine opportunities to actively engage in the consumption, circulation, discussion, and production of civic and political content via Web 2.0 technologies such as blogging, wikis, and social media.
Although these tools offer new ways for engagement, interaction, and dialogue, educators have also recognized the need to teach youth how to interact both respectfully and productively with their peers and members of online communities.
As a result, many school districts have also begun adopting Media Literacy Frameworks for Engaged Citizenship as a pedagogical approach to prepare students for active participatory citizenship in today's digital age. This model includes critical analysis of digital media as well as a deep understanding of media literacy as a “collaborative and participatory movement that aims to empower individuals to have a voice and to use it”.
Criticism of civic education:
Sudbury schools contend that values, social justice and democracy must be learned through experience, as Aristotle said: "For the things we have to learn before we can do them, we learn by doing them." They argue that, for this purpose, schools must encourage ethical behavior and personal responsibility.
In order to achieve these goals, schools must allow students the three great freedoms—freedom of choice, freedom of action and freedom to bear the results of action—that constitute personal responsibility. The "strongest political rationale" for democratic schools is that they teach "the virtues of democratic deliberation for the sake of future citizenship."
This type of education is often alluded to in the deliberative democracy literature as providing the fundamental social and institutional changes necessary to develop a democracy that involves intensive participation in group decision-making, negotiation, and social life of consequence.
See also:
- Legal education in the United States
- Acculturation
- Citizenship education (subject)
- Civic engagement
- Community
- Digital civics
- Etiquette
- Global civics
- History of citizenship
- Index of civics articles
- Law and order
- Law
- Legal awareness
- Legal socialisation
- Participation (decision making)
- Political Science
- Public space
- Socialisation
- Spatial Citizenship
- Voting
Citizens for Responsibility and Ethics in Washington, including US Office of Government Ethics
- YouTube Video: Donald Trump's conflicts of interest span the globe
- YouTube Video: IMHO: The Trump Family’s Conflicts Of Interest
- YouTube Video: President Trump Disgrace Laid Bare In Interactions With Gold Star Families | Rachel Maddow | MSNBC
Trump's 2,000 Conflicts of Interest (and Counting):
UNPRECEDENTED CONFLICTS OF INTEREST: During his run for president, Donald Trump promised that he would “drain the swamp” in Washington by rooting out corruption and wrenching power from lobbyists and special interests.
In the two and a half years since he assumed the office of President of the United States, he has done exactly the opposite: placing former lobbyists in positions of power while giving foreign governments and special interests the opportunity to purchase access to his administration by patronizing his businesses. Instead of limiting Washington corruption, President Trump has pushed it into uncharted territory, inventing new forms of corruption.
The following is excerpted from the report on Donald Trump's conflicts of interest by Citizens for Responsibility and Ethics in Washington (CREW):
Prior to President Trump, every modern president divested their business interests before entering office. For decades, this norm of presidential conduct has served as an important signal for both Republican and Democratic administrations to show that, as the nation’s most powerful and prominent public servant, the president would not put personal financial interests before the interests of the country.
Divestiture also served as an assurance to the public that the president would not open himself up to undue influence from special interests and foreign governments that might use his businesses as a way to curry favor with him and his administration.
President Trump rejected these principles, and as a result, he has created an environment where the actions of his administration, and those trying to influence it, can benefit the globe-spanning real-estate and branding empire that he still profits from.
The conflicts of interest that President Trump created by retaining his business interests are at some level immeasurable. There is no comprehensive financial filing requirement, for example, that the public or Congress can use to effectively identify these contacts, and the administration has rebuffed good-faith efforts to provide the public with useful information, whether White House visitor logs or the president's tax returns.
But that hasn’t kept CREW from relentlessly tracking what information is available through public sources and compiling it in a database of President Trump’s conflicts of interest. Since he took office two and a half years ago, CREW has used news reports, social media postings, FOIA responses, a newsletter devoted to monitoring his D.C. hotel, and other information to catalogue every identifiable interaction between the Trump Organization and the government and between the Trump Organization and those trying to influence the Trump administration.
Each of those interactions results in a conflict between the public interest and Donald Trump's personal financial interest.
TRUMP CONFLICTS OF INTEREST BY THE NUMBERS:
CREW has now tallied 2,310 conflicts resulting from President Trump's decision to retain his business interests. Here are some of the most startling numbers from the data:
Click here for more about Trump's 2,000 Conflicts of Interest (and Counting)
___________________________________________________________________________
Citizens for Responsibility and Ethics in Washington
Citizens for Responsibility and Ethics in Washington (CREW) is a nonprofit 501(c)(3) and nonpartisan U.S. government ethics and accountability watchdog organization.
Founded in 2003 as a counterweight to conservative government watchdog groups such as Judicial Watch, CREW works to expose ethics violations and corruption by government officials and institutions and to reduce the role of money in politics.
Its activities include investigating, reporting and litigating government misconduct, requesting and forcing government information disclosure through FOIA requests, and filing congressional ethics complaints against individuals, institutions and agencies.
Its projects have included the publication of "CREW's Most Corrupt Members of Congress", an annual report in which CREW lists the people it determines to be the federal government's most corrupt politicians. From 2005 to 2014, the annual reports named 25 Democrats and 63 Republicans.
Since liberal Democratic operative David Brock became CREW's chairman in 2014, CREW has almost exclusively pursued investigations and litigation against Republicans and conservatives. Brock stepped down as the group's chairman in 2016 and was replaced by Richard Painter, who went on to take a leave of absence to run as a Democrat in Minnesota's 2018 U.S. Senate special election.
Under Painter's leadership, CREW has pursued aggressive litigation against the Trump administration, and as of January 2018 had filed 180 lawsuits against what it called the "most unethical presidency" in U.S. history.
United States Office of Government Ethics
The United States Office of Government Ethics (OGE) is an independent agency within the executive branch of the U.S. federal government, responsible for directing executive-branch policies to prevent conflicts of interest on the part of federal executive-branch officers and employees.
Under the Ethics in Government Act, this agency was originally part of the Office of Personnel Management from 1978 until it separated in 1989.
Primary duties:
The main duties of OGE include the following:
Office of Director:
The Director of OGE is appointed by the President after confirmation by the U.S. Senate.
The Director of OGE serves a five-year term, thereby overlapping presidential terms, and is subject to no term limit. The rest of the OGE's employees are career civil servants.
Emory Rounds is the current Director of the OGE, having been sworn into office on August 2, 2018.
Issues involving President Trump:
A series of tweets on 30 November 2016 from the office's official Twitter account praised President-elect Donald Trump for planning to divest his business holdings in order to resolve potential conflicts of interest. The tweets followed an announcement in which Trump reaffirmed his intent to step away from business operations, despite having made no firm commitment to divestment, such as selling his businesses or placing them in a blind trust.
A number of observers speculated that the office's account might have been hacked, a suggestion it later denied. The New York Times suggested that the apparent misunderstanding behind the postings was deliberate, intended to reveal that the independent agency had advised Trump's legal counsel that divestment was the only adequate remedy for resolving any conflict and, by extension, to pressure Trump into doing so.
A Freedom of Information Act request by news organization The Daily Dot revealed that OGE Director Walter M. Shaub personally ordered officials within the agency to post the nine tweets.
Under the Trump Administration, the Office reversed its own internal policy prohibiting anonymous donations from lobbyists to White House staffers who have legal defense funds.
OGE certification of Ethics Agreement Compliance Form:
On May 11, 2017, the Office of Government Ethics requested the Trump administration and its associates submit a form regarding divestment of assets and possible conflicts of interest.
List of Directors of the OGE:
See also:
Click on any of the following blue hyperlinks for more about Donald Trump's Conflicts of Interest as reported by Citizens for Responsibility and Ethics in Washington (CREW):
- BY THE NUMBERS
- VISITS TO TRUMP PROPERTIES
- POLITICAL EVENTS AND SPENDING AT TRUMP PROPERTIES
- BLURRING THE LINE WITH THE WHITE HOUSE
- INTERNATIONAL TRAVEL AND BUSINESS
Prior to President Trump, every modern president divested their business interests before entering office. For decades, this norm of presidential conduct has served as an important signal for both Republican and Democratic administrations to show that, as the nation’s most powerful and prominent public servant, the president would not put personal financial interests before the interests of the country.
Divestiture also served as an assurance to the public that the president would not open himself up to undue influence from special interests and foreign governments that might use his businesses as a way to curry favor with him and his administration.
President Trump rejected these principles, and as a result, he has created an environment where the actions of his administration, and those trying to influence it, can benefit the globe-spanning real-estate and branding empire that he still profits from.
The conflicts of interest that President Trump created by retaining his business interests are at some level immeasurable. There is no comprehensive financial filing requirement, for example, that the public or Congress can use to effectively identify these contacts, and the administration has rebuffed good-faith efforts to provide the public with useful information, whether White House visitor logs or the president's tax returns.
But that hasn’t kept CREW from relentlessly tracking what information is available through public sources and compiling it in a database of President Trump’s conflicts of interest. Since he took office two and a half years ago, CREW has used news reports, social media postings, FOIA responses, a newsletter devoted to monitoring his D.C. hotel, and other information to catalogue every identifiable interaction between the Trump Organization and the government and between the Trump Organization and those trying to influence the Trump administration.
Each of those interactions creates a conflict between the public interest and Donald Trump's personal financial interest.
TRUMP CONFLICTS OF INTEREST BY THE NUMBERS: CREW has now tallied 2,310 conflicts resulting from President Trump's decision to retain his business interests. Here are some of the most startling numbers from the data:
- The president has visited his properties 362 times at taxpayer expense during his administration, sometimes visiting more than one of them in a single day. In 2019 alone, he has visited his properties 81 times, helping to further establish them as centers of political power. The number of days on which President Trump has spent time at a Trump-branded property accounts for almost a third of the days he's been president.
- One hundred eleven officials from 65 foreign governments, representing 57 foreign countries, have made 137 visits to a Trump property, raising the question of how much foreign money has been spent at Trump's properties.
- Additionally, CREW has recorded 630 visits to Trump properties from at least 250 Trump administration officials. This includes high-level White House staff, members of Trump's cabinet, and individual agency employees. So far this year, CREW has recorded 198 visits by White House officials. Ivanka Trump, who has an ownership interest in the Trump hotel in D.C., and her husband Jared Kushner, both senior White House advisors, are the most frequent executive branch officials to visit Trump properties, other than the president himself. Jared has made 28 known visits, while Ivanka has made 23.
- Members of Congress have flocked to President Trump’s properties, despite their constitutional oversight responsibility to provide a check on the executive branch as it relates to President Trump’s conflicts of interest. Throughout his two and a half years as president, 90 members of Congress have made 188 visits to a Trump property.
- Forty-seven state officials, including 20 Republican governors, have made 64 visits to Trump properties, sometimes resulting in state taxpayer funds being spent there.
- President Trump has used the presidency to provide free publicity for his properties, which he still profits from as president. Over the course of his presidency, Trump has tweeted about or mentioned one of his properties on 159 occasions, and White House officials have followed suit: Members of Trump’s White House have mentioned a Trump property 65 times, sometimes in the course of their official duties.
- Political groups have hosted 63 events at Trump properties since President Trump took office, selling wealthy donors access to the administration while also enriching the president. Seventeen of these have been for Trump-linked groups, and another six have been hosted by groups linked to Vice President Mike Pence. Trump Victory, the joint fundraising arm of Trump’s 2020 election committee and the RNC, has hosted six events at Trump properties just this year, four of which were attended by the President himself. In all, the RNC and other Republican Party groups have had 28 events at Trump properties.
- Twenty Trump administration officials have attended 38 political events at a Trump property, giving wealthy donors who fund spending at the president’s businesses access to top officials to discuss their pet issues while they enrich President Trump personally.
- Political groups have spent $5.9 million at Trump properties since President Trump took office. So far this year, political groups have spent $1.1 million at Trump properties. In more than a decade prior to his run for president, Trump’s businesses never received more than $100,000 from political groups in a single year.
- The Trump Hotel in Washington, D.C. is the top beneficiary of this political spending. In just over two and a half years, the hotel has raked in $2.4 million in traceable political spending.
- Foreign governments and foreign government-linked organizations have hosted 12 events at Trump properties since the president took office. These events have been attended by at least 19 administration officials.
Click here for more about Trump's 2,000 Conflicts of Interest (and Counting)
___________________________________________________________________________
Citizens for Responsibility and Ethics in Washington
Citizens for Responsibility and Ethics in Washington (CREW) is a nonprofit 501(c)(3) and nonpartisan U.S. government ethics and accountability watchdog organization.
Founded in 2003 as a counterweight to conservative government watchdog groups such as Judicial Watch, CREW works to expose ethics violations and corruption by government officials and institutions and to reduce the role of money in politics.
Its activities include investigating, reporting and litigating government misconduct, requesting and forcing government information disclosure through FOIA requests, and filing congressional ethics complaints against individuals, institutions and agencies.
Its projects have included the publication of "CREW's Most Corrupt Members of Congress", an annual report listing the officials that CREW determines to be the most corrupt politicians in the U.S. federal government. Between 2005 and 2014, the annual reports named 25 Democrats and 63 Republicans.
Since liberal Democratic operative David Brock became CREW's chairman in 2014, CREW has almost exclusively pursued investigations and litigation against Republicans and conservatives. Brock stepped down as the group's chairman in 2016 and was replaced by Richard Painter, who went on to take a leave of absence to run as a Democrat in Minnesota's 2018 U.S. Senate special election.
Under Painter's leadership, CREW has pursued aggressive litigation against the Trump administration, and as of January 2018 had filed 180 lawsuits against what it called the "most unethical presidency" in U.S. history.
Click on any of the following blue hyperlinks for more about Citizens for Responsibility and Ethics in Washington:
- History
- Mission statements
- Activities
- Personnel
- Allegations of partisanship
- Funding
- See also:
- Citizens for Responsibility and Ethics in Washington, official site
- "Citizens for Responsibility and Ethics in Washington Internal Revenue Service filings". ProPublica Nonprofit Explorer.
- Campaign for Accountability
- Center for Effective Government
- Government Accountability Project
- Project On Government Oversight
United States Office of Government Ethics
The United States Office of Government Ethics (OGE) is an independent agency within the executive branch of the U.S. Federal Government which is responsible for directing executive branch policies relating to the prevention of conflict of interest on the part of Federal executive branch officers and employees.
Under the Ethics in Government Act, this agency was originally part of the Office of Personnel Management from 1978 until it separated in 1989.
Primary duties:
The main duties of OGE include the following:
- Establishing the standards of conduct for the executive branch;
- Issuing rules and regulations interpreting the criminal conflict of interest restrictions;
- Establishing the framework for the public and confidential financial disclosure systems for executive branch employees;
- Developing training and education programs for use by executive branch ethics officials and employees;
- Ensuring that individual agency ethics programs are functioning properly by setting the requirements for them, supporting them, and reviewing them.
Office of Director:
The Director of OGE is appointed by the President, subject to confirmation by the U.S. Senate.
The Director of OGE serves a five-year term, thereby overlapping presidential terms, and is subject to no term limit. The rest of the OGE employees are career civil servants. Created by the Ethics in Government Act of 1978, OGE separated from the U.S. Office of Personnel Management in 1989 pursuant to reform legislation.
Emory Rounds is the current Director of the OGE, having been sworn into office on August 2, 2018.
Issues involving President Trump:
A series of tweets on 30 November 2016 from the office's official Twitter account praised President-elect Donald Trump for planning to divest his business holdings in order to resolve potential conflicts of interest. The tweets followed an announcement in which Trump reaffirmed his intent to remove himself from business operations, even though he had made no firm commitment to divestment, such as selling his businesses or placing them in a blind trust.
A number of observers speculated that the office's account might have been hacked, a suggestion the office later denied. The New York Times suggested that the postings were deliberately intended to reveal that the independent agency had advised Trump's legal counsel that divestment was the only adequate remedy for resolving any conflicts and, by extension, to pressure Trump into divesting.
A Freedom of Information Act request by news organization The Daily Dot revealed that OGE Director Walter M. Shaub personally ordered officials within the agency to post the nine tweets.
Under the Trump Administration, the Office reversed its own internal policy prohibiting anonymous donations from lobbyists to White House staffers who have legal defense funds.
OGE Certification of Ethics Agreement Compliance Form:
On May 11, 2017, the Office of Government Ethics requested that the Trump administration and its associates submit a form regarding divestment of assets and possible conflicts of interest.
List of Directors of the OGE:
- J. Jackson Walter, 1979–1982
- David H. Martin, 1983–1987
- Frank Q. Nebeker, 1987–1989
- Stephen D. Potts, 1990–2000
- Amy Comstock, 2000–2003
- Robert Cusick, 2006–2011
- Walter Shaub, 2013–2017
- David J. Apol, 2017–2018 (acting)
- Emory A. Rounds III, 2018–present
See also:
Citizens United Organization vs. End Citizens United Movement
- YouTube Video: Citizens United v. FEC | BRI's Homework Help Series
- YouTube Video: End Citizens United by Ingenuity
- YouTube Video: Our democracy no longer represents the people. Here's how we fix it | Larry Lessig | TEDxMidAtlantic
Citizens United is a conservative 501(c)(4) nonprofit organization in the United States founded in 1988. In 2010 the organization won a U.S. Supreme Court case known as Citizens United v. FEC, which struck down as unconstitutional a federal law prohibiting corporations and unions from making expenditures in connection with federal elections. The organization's current president and chairman is David Bossie.
Overview
Citizens United's stated mission is to restore the United States government to "citizens' control, through a combination of education, advocacy, and grass-roots organization" seeking to "reassert the traditional American values of limited government, freedom of enterprise, strong families, and national sovereignty and security."
Citizens United is a conservative political advocacy group organized under Section 501(c)(4) of the federal tax code, meaning that donations to it are not tax deductible. To fulfill this mission, Citizens United produces television commercials, web advertisements, and documentary films.
CU films have won film festival awards, including Perfect Valor (Best Documentary at the GI Film Festival) and Ronald Reagan: Rendezvous with Destiny (Remi Award at Houston Worldfest International Festival).
David Bossie has been its president since 2000. In 2016 he took a leave of absence to be deputy campaign manager of Donald Trump's campaign for President of the United States. Its offices are on Pennsylvania Avenue in the Capitol Hill area of Washington, D.C.
Citizens United and the Trump Administration:
In 2016 the Donald Trump presidential campaign enlisted Citizens United president David Bossie as deputy campaign manager. During the campaign, Bossie made regular television appearances on behalf of the Trump campaign. Bossie is a close friend and longtime acquaintance of Trump administration officials Steve Bannon and Kellyanne Conway, having introduced Bannon to Trump in 2011.
Citizens United Productions:
Citizens United Productions, headed by president David Bossie, has released 25 feature-length documentaries. The following is a list of films produced by Citizens United Productions.
- ACLU: At War with America
- America at Risk
- Battle for America
- Blocking 'The Path to 9/11'
- Border War: The Battle Over Illegal Immigration
- Broken Promises: The UN at 60
- Celsius 41.11
- A City Upon a Hill
- Fast Terry – documentary on Terry McAuliffe's controversial business deals
- Fire From the Heartland: The Awakening of the Conservative Woman
- Generation Zero
- The Gift of Life
- Hillary: The Movie
- The Hope and the Change – release date September 2012
- HYPE: The Obama Effect
- Nine Days that Changed the World – hosted by Newt Gingrich and his wife, Callista Gingrich
- Occupy Unmasked
- Perfect Valor
- Rediscovering God in America
- Rediscovering God in America II: Our Heritage
- Rocky Mountain Heist
- Ronald Reagan: Rendezvous with Destiny
- We Have the Power: Making America Energy Independent
- Torchbearer
Citizens United v. Federal Election Commission
Main article: Citizens United v. Federal Election Commission
Citizens United was the plaintiff in a Supreme Court case that began as a challenge to various statutory provisions of the Bipartisan Campaign Reform Act of 2002 (BCRA), known as the "McCain-Feingold" law.
The case revolved around the documentary Hillary: The Movie, which was produced by Citizens United. Under the McCain-Feingold law, a federal court in Washington D.C. ruled that Citizens United would be barred from advertising its film.
The case (08-205, 558 U.S. 50 (2010)) was heard in the United States Supreme Court on March 24, 2009. During oral argument, the government argued that under existing precedents, it had the power under the Constitution to prohibit the publication of books and movies if they were made or sold by corporations.
After that hearing, the Court requested re-argument specifically to address whether deciding the case required the Court to reconsider those earlier decisions in Austin v. Michigan Chamber of Commerce and McConnell v. FEC. The case was re-argued on September 9, 2009.
On January 21, 2010, the Supreme Court overturned the provision of McCain-Feingold barring corporations and unions from paying for political ads made independently of candidate campaigns. A dissenting opinion by Justice Stevens was joined by Justice Ginsburg, Justice Breyer, and Justice Sotomayor.
Funding:
Citizens United has accepted funding from The Presidential Coalition, LLC, and the Koch brothers.
See also:
- Citizens United official website
- Citizens United on IMDb
End Citizens United:
End Citizens United (ECU) is a political action committee in the United States. The organization is working to reverse the U.S. Supreme Court's 2010 decision in Citizens United v. Federal Election Commission (above), which deregulated limits on independent expenditure group spending for (or against) specific candidates.
ECU is focused on driving large campaign donations out of politics, with a goal of electing "campaign-finance reform champions" to Congress by contributing to and raising money for these candidates as well as running independent expenditures. End Citizens United was founded in 2015, operating in its first election cycle during 2016 with more than $25 million in funding.
The organization has endorsed Democratic candidates such as Zephyr Teachout, Hillary Clinton, Russ Feingold, Beto O'Rourke, Elizabeth Warren and Jon Ossoff.
For the 2016 election, it was one of the largest outside groups funding the campaigns of U.S. Senators Maggie Hassan and Catherine Cortez Masto, spending a combined $4.4 million on the races.
By mid-2017, End Citizens United had raised more than $7.5 million from grassroots donations, and planned to raise $35 million for the 2018 election cycle.
In the spring of 2018, an anonymous U.S.-based contractor paid at least 3,800 micro job workers to manipulate what stories would come up when people searched for the PAC via Google.
See also:
Advocacy Group, including a List of Advocacy Groups
- YouTube Video: Six Steps to Effective Advocacy Campaigns
- YouTube Video: Greenpeace: Inspiring Action
- YouTube Video: People for the Ethical Treatment of Animals ("PETA")
Your WebHost: the above image is taken from an article by a political advocacy group fighting for "Medicare for All": click on the following link for more: "LOBBYIST DOCUMENTS REVEAL HEALTH CARE INDUSTRY BATTLE PLAN AGAINST 'MEDICARE FOR ALL'"
For a List of Advocacy Groups, click here.
Advocacy groups, also known as special interest groups, use various forms of advocacy in order to influence public opinion and ultimately policy. They play an important role in the development of political and social systems.
Motives for action may be based on political, religious, moral, or commercial positions.
Groups use varied methods to try to achieve their aims including lobbying, media campaigns, publicity stunts, polls, research, and policy briefings. Some groups are supported or backed by powerful business or political interests and exert considerable influence on the political process, while others have few or no such resources.
Some have developed into important social and political institutions or social movements.
Some powerful advocacy groups have been accused of manipulating the democratic system for narrow commercial gain and in some instances have been found guilty of corruption, fraud, bribery, and other serious crimes; lobbying has become increasingly regulated as a result.
Some groups, generally ones with less financial resources, may use direct action and civil disobedience and in some cases are accused of being a threat to the social order or 'domestic extremists'. Research is beginning to explore how advocacy groups use social media to facilitate civic engagement and collective action.
Overview:
See also: Classification of advocacy groups
An advocacy group is an organization that tries to influence the government but does not itself hold power in the government.
Activities:
Advocacy groups exist in a wide variety of genres based upon their most pronounced activities.
Influence:
In most liberal democracies, advocacy groups tend to use the bureaucracy as the main channel of influence, because in liberal democracies this is where much of the decision-making power lies. The aim of advocacy groups working through this channel is to persuade those who make policy decisions to support their cause.
Access to this channel is generally restricted to groups with insider status such as large corporations and trade unions – groups with outsider status are unlikely to be able to meet with ministers or other members of the bureaucracy to discuss policy.
What must be understood about groups exerting influence in the bureaucracy is that "the crucial relationship here [in the bureaucracy] is usually that between the senior bureaucrats and leading business or industrial interests". This supports the view that groups with greater financial resources at their disposal will generally be better able to influence the decision-making process of government.
The advantage that large businesses have is mainly due to their position as key producers within their country's economy: their interests are important to the government because their contributions are important to the economy. According to George Monbiot, the influence of big business has been strengthened by "the greater ease with which corporations can relocate production and investment in a global economy". This suggests that in the ever-modernizing world, big business has an increasing role in influencing the bureaucracy and, in turn, the decision-making process of government.
Advocacy groups can also exert influence through the assembly by lobbying. Groups with greater economic resources at their disposal can employ professional lobbyists to try to exert influence in the assembly.
An example of such a group is the environmentalist group Greenpeace; Greenpeace (an organization with income upward of $50,000,000) uses lobbying to gain political support for its campaigns. It raises issues about the environment with the aim of having those issues translated into policy, such as the government encouraging alternative energy and recycling.
The judicial branch of government can also be used by advocacy groups to exert influence. In states where legislation cannot be challenged by the courts, like the UK, advocacy groups are limited in the amount of influence they have. In states that have codified constitutions, like the US, however, advocacy group influence is much more significant.
For example, in 1954 the NAACP (National Association for the Advancement of Colored People) brought a case against the Topeka Board of Education, arguing that the segregation of education based on race was unconstitutional. As a result of the NAACP's pressure, the Supreme Court unanimously ruled that racial segregation in education was indeed unconstitutional and such practices were banned. This is a notable example of how advocacy groups can exert influence in the judicial branch of government.
Advocacy groups can also exert influence on political parties. The main way groups do this is through campaign finance. For instance, in the UK, the Conservative Party's campaigns are often funded by large corporations, as many of its campaigns reflect the interests of business. For example, George W. Bush's re-election campaign in 2004 was the most expensive in American history and was financed mainly by large corporations and industrial interests that the Bush administration represented in government.
Conversely, left-wing parties are often funded by organised labour – when the British Labour Party was formed, it was largely funded by trade unions. Often, political parties are actually formed as a result of group pressure, for example, the Labour Party in the UK was formed out of the new trade-union movement which lobbied for the rights of workers.
Advocacy groups also exert influence through channels that are separate from the government or the political structure such as the mass media and through public opinion campaigning. Advocacy groups will use methods such as protesting, petitioning and civil disobedience to attempt to exert influence in Liberal Democracies.
Groups will generally use two distinct styles when attempting to manipulate the media – they will either put across their outsider status and use their inability to access the other channels of influence to gain sympathy or they may put across a more ideological agenda.
Traditionally, a prime example of such groups was the trade unions, the so-called "industrial muscle". Trade unions would campaign in the form of industrial action and marches for workers' rights, which gained much media attention and sympathy for their cause.
In the United States, the Civil Rights Movement gained much of its publicity through civil disobedience; African Americans would simply disobey the racist segregation laws in order to provoke the violent, racist reactions of the police and white Americans. This violence and racism was then broadcast all over the world, showing the world just how one-sided the race "war" in America actually was.
Advocacy group influence has also manifested itself in supranational bodies that have arisen through globalization. Groups that already had a global structure, such as Greenpeace, were better able to adapt to globalization. Greenpeace, for example, has offices in over 30 countries and an annual income of $50 million. Groups such as these have secured the nature of their influence by gaining status as nongovernmental organizations (NGOs), many of which oversee the work of the UN and the EU from their permanent offices in America and Europe.
Group pressure by supranational industries can be exerted in a number of ways: "through direct lobbying by large corporations, national trade bodies and 'peak' associations such as the European Round Table of Industrialists".
Influential advocacy groups:
See also: social movement and Category:Advocacy groups
There have been many significant advocacy groups throughout history, some of which operated with dynamics that would better categorize them as social movements. Here are some notable advocacy groups operating in different parts of the world:
Adversarial groupings:
On some controversial issues there are a number of competing advocacy groups, sometimes with very different resources available to them:
Benefits and incentives:
Free rider problem:
A general theory is that individuals must be enticed with some type of benefit to join an interest group. However, the free rider problem addresses the difficulty of obtaining members of a particular interest group when the benefits are already reaped without membership.
For instance, an interest group dedicated to improving farming standards will fight for the general goal of improving farming for every farmer, even those who are not members of that particular interest group. Thus, there is no real incentive to join an interest group and pay dues if the farmer will receive that benefit anyway.
For another example, every individual in the world would benefit from a cleaner environment, but environmental protection interest groups do not receive monetary help from every individual in the world.
This poses a problem for interest groups, which require dues from their members and contributions in order to accomplish the groups' agendas.
Selective benefits:
Selective benefits are material, rather than monetary, benefits conferred on group members. For instance, an interest group could give members travel discounts, free meals at certain restaurants, or free subscriptions to magazines, newspapers, or journals. Many trade and professional interest groups tend to give these types of benefits to their members.
Solidarity incentives:
A solidary incentive is a reward for participation that is socially derived and created out of the act of association. A selective solidary benefit offered to members or prospective members of an interest group might involve such incentives as "socializing congeniality, the sense of group membership and identification, the status resulting from membership, fun, conviviality, the maintenance of social distinctions, and so on.
Expressive incentives;
People who join an interest group because of expressive benefits likely joined to express an ideological or moral value that they believe in, such as free speech, civil rights, economic justice, or political equality. To obtain these types of benefits, members would simply pay dues, and donate their time or money to get a feeling of satisfaction from expressing a political value.
Also, it would not matter if the interest group achieved their goal; these members would merely be able to say they helped out in the process of trying to obtain their goals, which is the expressive incentive that they got in the first place. The types of interest groups that rely on expressive benefits or incentives are environmental groups and groups who claim to be lobbying for the public interest.
Latent interests:
Some public policy interests are not recognized or addressed by a group at all. These interests are labeled latent interests.
Theoretical perspectives:
Much work has been undertaken by academics attempting to categorize how advocacy groups operate, particularly in relation to governmental policy creation. The field is dominated by numerous and diverse schools of thought:
Social media use:
A study published in early 2012 suggests that advocacy groups of varying political and ideological orientations operating in the United States are using social media to interact with citizens every day. The study surveyed 53 groups, that were found to be using a variety of social media technologies to achieve organizational and political goals:
As noted in the study, "while some groups raised doubts about social media’s ability to overcome the limitations of weak ties and generational gaps, an overwhelming majority of groups see social media as essential to contemporary advocacy work and laud its democratizing function."
See also:
For a List of Advocacy Groups, click here.
Advocacy groups, also known as special interest groups, use various forms of advocacy in order to influence public opinion and ultimately policy. They play an important role in the development of political and social systems.
Motives for action may be based on political, religious, moral, or commercial positions.
Groups use varied methods to try to achieve their aims including lobbying, media campaigns, publicity stunts, polls, research, and policy briefings. Some groups are supported or backed by powerful business or political interests and exert considerable influence on the political process, while others have few or no such resources.
Some have developed into important social, political institutions or social movements.
Some powerful advocacy groups have been accused of manipulating the democratic system for narrow commercial gain and in some instances have been found guilty of corruption, fraud, bribery, and other serious crimes; lobbying has become increasingly regulated as a result.
Some groups, generally those with fewer financial resources, may use direct action and civil disobedience, and in some cases are accused of threatening the social order or branded 'domestic extremists'. Research is beginning to explore how advocacy groups use social media to facilitate civic engagement and collective action.
Overview:
See also: Classification of advocacy groups
An advocacy group is a group or organization that tries to influence the government but does not hold power in government.
Activities:
Advocacy groups exist in a wide variety of genres based upon their most pronounced activities.
- Anti-defamation organizations issue responses or criticisms to real or supposed slights of any sort (including speech or violence) by an individual or group against a specific segment of the population which the organization exists to represent.
- Watchdog groups exist to provide oversight and rating of actions or media by various outlets, both government and corporate. They may also index personalities, organizations, products, and activities in databases to provide coverage and rating of the value or viability of such entities to target demographics.
- Lobby groups lobby for a change to the law or the maintenance of a particular law, and big businesses fund considerable lobbying influence on legislators, for example in the USA and in the UK, where lobbying first developed. Some lobby groups have considerable financial resources at their disposal. Lobbying is regulated to stop the worst abuses, which can develop into corruption. In the United States, the Internal Revenue Service makes a clear distinction between lobbying and advocacy.
- Lobby groups spend considerable amounts of money on election advertising as well. For example, the 2011 documentary film Hot Coffee contains interviews of former Mississippi Supreme Court Justice Oliver E. Diaz, Jr. and evidence the US Chamber of Commerce paid for advertising to unseat him.
- Legal defense funds provide funding for the legal defense for, or legal action against, individuals or groups related to their specific interests or target demographic. This is often accompanied by one of the above types of advocacy groups filing an amicus curiae if the cause at stake serves the interests of both the legal defense fund and the other advocacy groups.
Influence:
In most liberal democracies, advocacy groups tend to use the bureaucracy as the main channel of influence, because that is where decision-making power lies. Advocacy groups may also attempt to influence members of the legislature to support their cause by voting a certain way.
Access to this channel is generally restricted to groups with insider status such as large corporations and trade unions – groups with outsider status are unlikely to be able to meet with ministers or other members of the bureaucracy to discuss policy.
What must be understood about groups exerting influence in the bureaucracy is that "the crucial relationship here [in the bureaucracy] is usually that between the senior bureaucrats and leading business or industrial interests". This supports the view that groups with greater financial resources at their disposal will generally be better able to influence the decision-making process of government.
The advantage that large businesses have is mainly due to the fact that they are key producers within their country's economy; their interests matter to the government because their contributions matter to the economy. According to George Monbiot, the influence of big business has been strengthened by "the greater ease with which corporations can relocate production and investment in a global economy". This suggests that in an ever-modernizing world, big business has an increasing role in influencing the bureaucracy and, in turn, the decision-making process of government.
Advocacy groups can also exert influence through the assembly by lobbying. Groups with greater economic resources at their disposal can employ professional lobbyists to try to exert influence in the assembly.
An example of such a group is the environmentalist organization Greenpeace. Greenpeace (with income upward of $50,000,000) uses lobbying to gain political support for its campaigns, raising issues about the environment with the aim of having them translated into policy, such as government encouragement of alternative energy and recycling.
The judicial branch of government can also be used by advocacy groups to exert influence. In states where legislation cannot be challenged by the courts, like the UK, advocacy groups are limited in the amount of influence they have. In states that have codified constitutions, like the US, however, advocacy group influence is much more significant.
For example, in 1954 the NAACP (National Association for the Advancement of Colored People) litigated against the Topeka Board of Education, arguing that segregation of education based on race was unconstitutional. As a result of pressure from the NAACP, the Supreme Court unanimously ruled that racial segregation in education was indeed unconstitutional, and such practices were banned. This is a notable example of how advocacy groups can exert influence in the judicial branch of government.
Advocacy groups can also exert influence on political parties, mainly through campaign finance. For instance, in the UK, the Conservative Party's campaigns are often funded by large corporations, as many of its campaigns reflect the interests of businesses. Similarly, George W. Bush's re-election campaign in 2004 was the most expensive in American history and was financed mainly by large corporations and industrial interests that the Bush administration represented in government.
Conversely, left-wing parties are often funded by organised labour – when the British Labour Party was formed, it was largely funded by trade unions. Often, political parties are actually formed as a result of group pressure, for example, the Labour Party in the UK was formed out of the new trade-union movement which lobbied for the rights of workers.
Advocacy groups also exert influence through channels that are separate from the government or the political structure, such as the mass media and public opinion campaigning. Advocacy groups will use methods such as protesting, petitioning and civil disobedience to attempt to exert influence in liberal democracies.
Groups will generally use two distinct styles when attempting to manipulate the media – they will either put across their outsider status and use their inability to access the other channels of influence to gain sympathy or they may put across a more ideological agenda.
Traditionally, a prime example of such a group was the trade unions, the so-called "industrial muscle". Trade unions would campaign in the form of industrial action and marches for workers' rights; these gained much media attention and sympathy for their cause.
In the United States, the Civil Rights Movement gained much of its publicity through civil disobedience; African Americans would simply disobey the racist segregation laws to provoke the violent, racist reaction of the police and white Americans. This violence and racism was then broadcast all over the world, showing just how one-sided the race 'war' in America actually was.
Advocacy group influence has also manifested itself in supranational bodies that have arisen through globalization. Groups that already had a global structure, such as Greenpeace, were better able to adapt to globalization. Greenpeace, for example, has offices in over 30 countries and an annual income of $50 million. Groups such as these have secured their influence by gaining status as nongovernmental organizations (NGOs), many of which monitor the work of the UN and the EU from their permanent offices in America and Europe.
Group pressure by supranational industries can be exerted in a number of ways: "through direct lobbying by large corporations, national trade bodies and 'peak' associations such as the European Round Table of Industrialists".
Influential advocacy groups:
See also: social movement and Category:Advocacy groups
There have been many significant advocacy groups throughout history, some of which operated with dynamics that would better categorize them as social movements. Here are some notable advocacy groups operating in different parts of the world:
- American Israel Public Affairs Committee (AIPAC), the American Israel lobby, which is described by The New York Times as the "most influential Lobby impacting US relations with Israel."
- British Medical Association, which formed at a meeting of 50 doctors in 1832 for the sharing of knowledge; its lobbying led to the Medical Act 1858 and the formation of the General Medical Council, which has registered and regulated doctors in the UK ever since.
- Campaign for Nuclear Disarmament, which has advocated for the non-proliferation of nuclear weapons and unilateral nuclear disarmament in the UK since 1957, and whose logo is now an international peace symbol.
- Center for Auto Safety, an organization formed in 1970 which aims to give consumers a voice for auto safety and quality in the United States.
- Communion and Liberation (Italian: Comunione e Liberazione), an Italian Catholic movement that has been accused of creating conflicts of interest in many private and public companies in Italy since the 1970s, and has been investigated by Italian authorities over allegations of bribery, corruption, and fraud.
- Drug Policy Alliance, whose principal goal is to end the American "War on Drugs".
- Electronic Frontier Foundation, an international non-profit digital rights advocacy and legal organization based in the United States.
- Energy Lobby, an umbrella term for the representatives of large oil, gas, coal, and electric utilities corporations that attempt to influence governmental policy in the United States.
- Financial Services Roundtable, an organization representing the banking lobby.
- Greenpeace, an organization formed in 1970 as the Don't Make a Wave Committee to stop nuclear weapons testing in the United States.
- The Human Rights Campaign, an LGBT civil rights advocacy and lobbying organization seeking to advance the cause of LGBT rights in America.
- National Rifle Association ("NRA"), an organization that formed in New York in 1871 to protect the rights of gun owners.
- Oxfam, an organization formed in 1942 in the UK as the Oxford Committee for Famine Relief.
- Pennsylvania Abolition Society, which formed in Philadelphia in 1775 with a mission to abolish slavery in the United States.
- People for the Ethical Treatment of Animals ("PETA"), an animal rights organization that focuses primarily on the treatment of animals on factory farms, in the clothing trade, in laboratories, and in the entertainment industry.
- Royal Society for the Protection of Birds, founded in Manchester in 1889 to campaign against the 'barbarous trade in plumes for women's hats'.
- Sierra Club, which formed in 1892 to help protect the Sierra Nevada.
- Stop the War Coalition, an organization against the War on Terrorism, which organized a march of between 750,000 and 2,000,000 people in London in 2003.
- Suffragettes, which sought to gain voting rights for women through direct action and hunger strikes from 1865 to 1928 in the United Kingdom.
- The Affiliated Residential Park Residents Association Incorporated (ARPRA), which was established in 1986 to represent residents of residential parks in New South Wales, Australia.
- The Puntland Human Rights Association, which advocates for the rights of children, women, and minority groups in Somalia; founded in December 2006 in Gardo, in the Puntland region of Somalia.
- Sunday School movement, which formed circa 1751 to promote universal schooling in the UK.
- Tory Party ("Tories"), which formed in 1678 to fight the British Exclusion Bill and developed into one of the first political parties; now known as the Conservative Party
- US Chamber of Commerce, by far the biggest lobby group in the US by expenditures.
- World Economic Forum, a deregulation lobby group for large multinationals.
Adversarial groupings:
On some controversial issues there are a number of competing advocacy groups, sometimes with very different resources available to them:
- Pro-choice vs Pro-life movements (abortion policy in the United States)
- SPEAK campaign vs Pro-Test (animal testing in United Kingdom)
- The Automobile Association vs Pedestrians' Association (now 'Living Streets') (road safety in the United Kingdom since 1929)
- Tobacco Institute vs Action on Smoking and Health (tobacco legislation)
- Flying Matters vs Plane Stupid (aviation policy in the United Kingdom since 2007)
Benefits and incentives:
Free rider problem:
A general theory is that individuals must be enticed with some type of benefit to join an interest group. However, the free rider problem addresses the difficulty of obtaining members of a particular interest group when the benefits are already reaped without membership.
For instance, an interest group dedicated to improving farming standards will fight for the general goal of improving farming for every farmer, even those who are not members of that particular interest group. Thus, there is no real incentive to join an interest group and pay dues if the farmer will receive that benefit anyway.
For another example, every individual in the world would benefit from a cleaner environment, but environmental protection interest groups do not receive monetary help from every individual in the world.
This poses a problem for interest groups, which require dues from their members and contributions in order to accomplish the groups' agendas.
Selective benefits:
Selective benefits are material, rather than monetary, benefits conferred on group members. For instance, an interest group could give members travel discounts, free meals at certain restaurants, or free subscriptions to magazines, newspapers, or journals. Many trade and professional interest groups tend to give these types of benefits to their members.
Solidarity incentives:
A solidary incentive is a reward for participation that is socially derived and created out of the act of association. A selective solidary benefit offered to members or prospective members of an interest group might involve such incentives as "socializing congeniality, the sense of group membership and identification, the status resulting from membership, fun, conviviality, the maintenance of social distinctions, and so on".
Expressive incentives:
People who join an interest group because of expressive benefits likely joined to express an ideological or moral value that they believe in, such as free speech, civil rights, economic justice, or political equality. To obtain these types of benefits, members would simply pay dues and donate their time or money to get a feeling of satisfaction from expressing a political value.
Also, it would not matter whether the interest group achieved its goal; these members would merely be able to say they helped out in the process of trying to obtain it, which is the expressive incentive they sought in the first place. The types of interest groups that rely on expressive benefits or incentives are environmental groups and groups that claim to be lobbying for the public interest.
Latent interests:
Some public policy interests are not recognized or addressed by a group at all. These interests are labeled latent interests.
Theoretical perspectives:
Much work has been undertaken by academics attempting to categorize how advocacy groups operate, particularly in relation to governmental policy creation. The field is dominated by numerous and diverse schools of thought:
- Pluralism: This is based upon the understanding that advocacy groups operate in competition with one another and play a key role in the political system. They do this by acting as a counterweight to undue concentrations of power. However, this pluralist theory (formed primarily by American academics) reflects a more open and fragmented political system similar to that in countries such as the United States.
- Neo-pluralism: Under neo-pluralism, a concept of political communities developed that is more similar to the British form of government. This is based on the concept of political communities in that advocacy groups and other such bodies are organised around a government department and its network of client groups. The members of this network co-operate together during the policy making process.
- Corporatism or elitism: Some advocacy groups are backed by private businesses, which can have a considerable influence on the legislature.
Social media use:
A study published in early 2012 suggests that advocacy groups of varying political and ideological orientations operating in the United States are using social media to interact with citizens every day. The study surveyed 53 groups, which were found to be using a variety of social media technologies to achieve organizational and political goals:
- Facebook was the social media site of choice with all but one group noting that they use the site to connect with citizens.
- Twitter was also popular with all but two groups saying that they use Twitter.
- Other social media platforms were also being used.
As noted in the study, "while some groups raised doubts about social media’s ability to overcome the limitations of weak ties and generational gaps, an overwhelming majority of groups see social media as essential to contemporary advocacy work and laud its democratizing function."
See also:
- History
- Campaign finance in the United States
- Client politics
- Identity politics
- Methods used by advocacy groups
- Political party
- Pressure groups in the United Kingdom
- Pressure politics
Corruption Perceptions Index along with the Democracy Index
- YouTube Video: Corruption Perceptions Index Explained | Transparency International
- YouTube Video: Corruption Perceptions Index 2018 | Transparency International
- YouTube Video: How the rich get richer – money in the world economy | DW Documentary
Pictured below: Democracy Index map (courtesy of BlankMap-World6.svg by Canuckguy and many others; derivative work by WeifengYang)
The Corruption Perceptions Index (CPI) is an index published annually by Transparency International since 1995 which ranks countries "by their perceived levels of public sector corruption, as determined by expert assessments and opinion surveys." The CPI generally defines corruption as "the misuse of public power for private benefit".
The 2019 CPI, published in January 2020, currently ranks 180 countries "on a scale from 100 (very clean) to 0 (highly corrupt)". Denmark, New Zealand, and Finland are perceived as the least corrupt nations in the world, ranking consistently high in international financial transparency, while the country perceived as most corrupt is Somalia, scoring 9–10 out of 100 since 2017. South Sudan is also perceived as one of the most corrupt countries in the world due to constant social and economic crises, with an average score of 13 out of 100 in 2018.
Methods:
Transparency International commissioned the University of Passau's Johann Graf Lambsdorff to produce the CPI. The 2012 CPI takes into account 13 different surveys and assessments from 12 different institutions. The surveys/assessments are either business people opinion surveys or performance assessments from a group of analysts.
Early CPIs used public opinion surveys. The institutions are:
- African Development Bank (based in Ivory Coast)
- Bertelsmann Foundation (based in Germany)
- Economist Intelligence Unit (based in UK)
- Freedom House (based in US)
- Global Insight (based in US)
- International Institute for Management Development (based in Switzerland)
- Political and Economic Risk Consultancy (based in Hong Kong)
- The PRS Group, Inc. (based in US)
- World Economic Forum
- World Bank
- World Justice Project (based in US)
Countries need to be evaluated by at least three sources to appear in the CPI. The CPI measures perception of corruption due to the difficulty of measuring absolute levels of corruption.
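The aggregation rule above (a country appears in the index only when at least three sources score it, and its score is built from those sources) can be sketched as follows. This is an illustrative simplification with invented data, not Transparency International's actual standardization methodology:

```python
def cpi_score(source_scores, min_sources=3):
    """Average a country's standardized source scores (0-100 scale),
    returning None when too few sources cover the country."""
    scores = [s for s in source_scores if s is not None]
    if len(scores) < min_sources:
        return None  # country is excluded from the index
    return round(sum(scores) / len(scores), 1)

# A country covered by four sources is scored; one covered by
# only two sources is excluded from the index entirely.
print(cpi_score([88, 87, 85, 88]))  # -> 87.0
print(cpi_score([20, None, 25]))    # -> None (only two sources)
```

The three-source minimum exists because a single survey's perception estimate is noisy; averaging several independent assessments is what makes the published score defensible.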
Validity
A study published in 2002 found a "very strong significant correlation" between the Corruption Perceptions Index and two other proxies for corruption: black market activity and overabundance of regulation.
All three metrics also had a highly significant correlation with real gross domestic product per capita (RGDP/Cap); the Corruption Perceptions Index correlation with RGDP/Cap was the strongest, explaining over three fourths of the variance. (Note that a lower index on this scale reflects greater corruption, so that countries with higher RGDPs generally had less corruption.)
Economic implications:
Research papers published in 2007 and 2008 examined the economic consequences of corruption perception, as defined by the CPI. The researchers found a correlation between a higher CPI and higher long-term economic growth, as well as an increase in GDP growth of 1.7% for every unit increase in a country's CPI score. Also shown was a power-law dependence linking higher CPI score to higher rates of foreign investment in a country.
Click on any of the following blue hyperlinks for more about the Corruption Perceptions Index:
- Rankings
- Criticism and limitations
- See also:
- Official site
- Transparency International (2010). Corruption Perceptions Index 2010: Long methodological brief (PDF) (Report). Transparency International. Retrieved 24 August 2011.
- Corruption Perceptions Index 2013
- Interactive world map of the Corruption Perception Index: 2000-2008
- A Users' Guide to Measuring Corruption critiques the CPI and similar indices.
- Global Integrity Index
- List of Global Development Indexes and Rankings
The Democracy Index is an index compiled by the Economist Intelligence Unit (EIU), a UK-based company. It intends to measure the state of democracy in 167 countries, of which 166 are sovereign states and 164 are UN member states.
The index was first published in 2006, with updates for 2008, 2010 and later years. The index is based on 60 indicators grouped in five different categories, measuring pluralism, civil liberties and political culture. In addition to a numeric score and a ranking, the index categorises each country in one of four regime types: full democracies, flawed democracies, hybrid regimes and authoritarian regimes.
Method:
As described in the report, the democracy index is a weighted average based on the answers of 60 questions, each one with either two or three permitted alternative answers.
Most answers are "experts' assessments". Some answers are provided by public-opinion surveys from the respective countries. For countries where survey results are missing, survey results for similar countries and expert assessments are used to fill in the gaps.
The questions are grouped into five categories:
- electoral process and pluralism,
- civil liberties,
- the functioning of government,
- political participation,
- and political culture.
Each answer is converted to a score, either 0 or 1, or for the three-answer questions, 0, 0.5 or 1. With the exceptions mentioned below, within each category, the scores are added, multiplied by ten, and divided by the total number of questions within the category.
There are a few modifying dependencies, which are explained much more precisely than the main rule procedures.
In a few cases, an answer yielding zero for one question voids another question; e.g. if the elections for the national legislature and head of government are not considered free (question 1), then the next question, "Are elections... fair?", is not considered, but automatically scored zero.
Likewise, there are a few questions considered so important that a low score on them yields a penalty on the total score sum for their respective categories, namely:
The five category indices, which are listed in the report, are then averaged to find the Democracy Index for a given country. Finally, the Democracy Index, rounded to two decimals, decides the regime type classification of the country.
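The scoring arithmetic described above (per-category score sums scaled to 0–10, averaged across the five categories, rounded to two decimals) can be sketched in a few lines. The question data is invented for illustration, and the regime-type thresholds are assumed here from the EIU's commonly published bands:

```python
def category_index(answers):
    """Convert one category's answers (each 0, 0.5, or 1) into a 0-10 index:
    sum the scores, multiply by ten, divide by the number of questions."""
    return sum(answers) * 10 / len(answers)

def democracy_index(categories):
    """Average the five category indices and round to two decimals."""
    assert len(categories) == 5
    return round(sum(category_index(c) for c in categories) / 5, 2)

def regime_type(index):
    """Classify a country; bands assumed from the EIU's published scheme."""
    if index > 8:
        return "full democracy"
    if index > 6:
        return "flawed democracy"
    if index > 4:
        return "hybrid regime"
    return "authoritarian regime"

# Five invented categories of answers (0, 0.5, or 1 per question):
cats = [[1, 1, 0.5, 1], [1, 0.5, 1, 1], [0.5, 1, 1, 0.5],
        [1, 1, 1, 0.5], [0.5, 0.5, 1, 1]]
score = democracy_index(cats)
print(score, regime_type(score))  # -> 8.25 full democracy
```

The voiding and penalty rules (for example, scoring the fairness question zero when elections are not free) would be applied to the answer lists before the category sums are taken; they are omitted here for brevity.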
The report discusses other indices of democracy, as defined, for example, by Freedom House, and argues for some of the choices made by the team from the Economist Intelligence Unit. In this comparison, a higher emphasis is placed on public opinion and attitudes, as measured by surveys; on the other hand, economic living standards are not weighted as a criterion of democracy (as some other investigators have seemingly done).
The report is widely cited in the international press as well as in peer reviewed academic journals.
Classification definitions:
Full democracies are nations where civil liberties and fundamental political freedoms are not only respected but also reinforced by a political culture conducive to the thriving of democratic principles.
These nations have a valid system of governmental checks and balances, an independent judiciary whose decisions are enforced, governments that function adequately, and diverse and independent media. These nations have only limited problems in democratic functioning.
Flawed democracies are nations where elections are fair and free and basic civil liberties are honoured but may have issues (e.g. media freedom infringement and minor suppression of political opposition and critics). These nations have significant faults in other democratic aspects, including underdeveloped political culture, low levels of participation in politics, and issues in the functioning of governance.
Hybrid regimes are nations with regular electoral frauds, preventing them from being fair and free democracy. These nations commonly have governments that apply pressure on political opposition, non-independent judiciaries, widespread corruption, harassment and pressure placed on the media, anaemic rule of law, and more pronounced faults than flawed democracies in the realms of underdeveloped political culture, low levels of participation in politics, and issues in the functioning of governance.
Authoritarian regimes are nations where political pluralism has vanished or is extremely limited. These nations are often absolute monarchies or dictatorships, may have some conventional institutions of democracy but with meagre significance, infringements and abuses of civil liberties are commonplace, elections (if they take place) are not fair and free, the media is often state-owned or controlled by groups associated with the ruling regime, the judiciary is not independent, and there are omnipresent censorship and suppression of governmental criticism.
Click on any of the following blue hyperlinks for more about the Democracy Index:
The Democracy Index is an index compiled by the Economist Intelligence Unit (EIU), a UK-based company. It intends to measure the state of democracy in 167 countries, of which 166 are sovereign states and 164 are UN member states.
The index was first published in 2006, with updates for 2008, 2010 and later years. The index is based on 60 indicators grouped in five different categories, measuring pluralism, civil liberties and political culture. In addition to a numeric score and a ranking, the index categorises each country in one of four regime types: full democracies, flawed democracies, hybrid regimes and authoritarian regimes.
Method:
As described in the report, the democracy index is a weighted average based on the answers to 60 questions, each with either two or three permitted alternative answers.
Most answers are "experts' assessments". Some answers are provided by public-opinion surveys from the respective countries. For countries where survey results are missing, survey results for similar countries and expert assessments are used to fill in the gaps.
The questions are grouped into five categories:
- electoral process and pluralism,
- civil liberties,
- the functioning of government,
- political participation,
- and political culture.
Each answer is converted to a score, either 0 or 1, or for the three-answer questions, 0, 0.5 or 1. With the exceptions mentioned below, within each category, the scores are added, multiplied by ten, and divided by the total number of questions within the category.
There are a few modifying dependencies, which the report explains in more detail than the main scoring procedure.
In a few cases, an answer yielding zero for one question voids another question; e.g. if the elections for the national legislature and head of government are not considered free (question 1), then the next question, "Are elections... fair?", is not considered, but automatically scored zero.
Likewise, there are a few questions considered so important that a low score on them yields a penalty on the total score sum for their respective categories, namely:
- "Whether national elections are free and fair";
- "The security of voters";
- "The influence of foreign powers on government";
- "The capability of the civil servants to implement policies".
The five category indices, which are listed in the report, are then averaged to find the Democracy Index for a given country. Finally, the Democracy Index, rounded to two decimals, decides the regime type classification of the country.
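The scoring and classification rules above can be sketched in Python. The question lists and per-category counts are hypothetical, the penalty and dependency rules are omitted for brevity, and the regime-type cut-offs used here (above 8, above 6, above 4) follow the EIU's published bands:

```python
def category_index(scores):
    """Convert a category's question scores (each 0, 0.5 or 1) into a
    0-10 category index: sum, multiplied by ten, over the question count."""
    return sum(scores) * 10 / len(scores)

def democracy_index(categories):
    """Average the five category indices, round to two decimals,
    and classify the regime type from the rounded index."""
    index = round(sum(categories) / len(categories), 2)
    if index > 8:
        regime = "full democracy"
    elif index > 6:
        regime = "flawed democracy"
    elif index > 4:
        regime = "hybrid regime"
    else:
        regime = "authoritarian regime"
    return index, regime

# Hypothetical answers for the five categories:
cats = [category_index(s) for s in (
    [1, 1, 0.5, 1],    # electoral process and pluralism
    [1, 0.5, 0.5, 1],  # civil liberties
    [0.5, 0.5, 1],     # functioning of government
    [1, 0.5, 0.5],     # political participation
    [1, 1, 0.5],       # political culture
)]
print(democracy_index(cats))  # → (7.58, 'flawed democracy')
```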
The report discusses other indices of democracy, such as those defined by Freedom House, and argues for some of the choices made by the team from the Economist Intelligence Unit. In this comparison, greater emphasis is placed on public opinion and attitudes, as measured by surveys; on the other hand, economic living standards are not weighted as a criterion of democracy (as some other investigators have done).
The report is widely cited in the international press as well as in peer reviewed academic journals.
Classification definitions:
Full democracies are nations where civil liberties and fundamental political freedoms are not only respected but also reinforced by a political culture conducive to the thriving of democratic principles.
These nations have a valid system of governmental checks and balances, an independent judiciary whose decisions are enforced, governments that function adequately, and diverse and independent media. These nations have only limited problems in democratic functioning.
Flawed democracies are nations where elections are fair and free and basic civil liberties are honoured but may have issues (e.g. media freedom infringement and minor suppression of political opposition and critics). These nations have significant faults in other democratic aspects, including underdeveloped political culture, low levels of participation in politics, and issues in the functioning of governance.
Hybrid regimes are nations with regular electoral fraud, which prevents their elections from being fair and free. These nations commonly have governments that apply pressure on political opposition, non-independent judiciaries, widespread corruption, harassment and pressure placed on the media, anaemic rule of law, and more pronounced faults than flawed democracies in the realms of underdeveloped political culture, low levels of participation in politics, and issues in the functioning of governance.
Authoritarian regimes are nations where political pluralism has vanished or is extremely limited. These nations are often absolute monarchies or dictatorships, may have some conventional institutions of democracy but with meagre significance, infringements and abuses of civil liberties are commonplace, elections (if they take place) are not fair and free, the media is often state-owned or controlled by groups associated with the ruling regime, the judiciary is not independent, and there are omnipresent censorship and suppression of governmental criticism.
Click on any of the following blue hyperlinks for more about the Democracy Index:
- Democracy Index by country 2019
- Recent changes
- Criticism
- Democracy Index by regime type
- Democracy Index by region
- See also:
Foreign Electoral Intervention including Electoral Fraud
- YouTube Video: Examples of Voter Fraud Bombshell in NYC
- YouTube Video: Sketchy Kazakh Money Finds Its Way Into President Donald Trump's Dealings | Rachel Maddow | MSNBC
- YouTube Video: Voting: Last Week Tonight with John Oliver (HBO)
Foreign electoral interventions are attempts by governments, covertly or overtly, to influence elections in another country. There are many ways that nations have accomplished regime change abroad, and electoral intervention is only one of those methods.
Theoretical and empirical research on the effect of foreign electoral intervention had been characterized as weak overall as late as 2011; however, since then a number of such studies have been conducted.
One study indicated that the country intervening in most foreign elections is the United States with 81 interventions, followed by Russia (including the former Soviet Union) with 36 interventions from 1946 to 2000—an average of once in every nine competitive elections.
Academic studies:
Measurement of interventions:
A 2019 study by Lührmann et al. at the Varieties of Democracy Institute in Sweden summarized reports from each country to say that in 2018 the most intense interventions, by means of false information on key political issues, were by China in Taiwan and by Russia in Latvia; the next highest levels were in Bahrain, Qatar and Hungary; the lowest levels were in Trinidad and Tobago, Switzerland and Uruguay.
A 2016 study by Dov H. Levin found that, among 938 global elections examined, the United States and Russia (including its predecessor, the Soviet Union) combined had involved themselves in about one out of nine (117), with the majority of those (68%) being through covert, rather than overt, actions.
The same study found that "on average, an electoral intervention in favor of one side contesting the election will increase its vote share by about 3 percent," an effect large enough to have potentially changed the results in seven out of 14 U.S. presidential elections occurring after 1960.
According to the study, the U.S. intervened in 81 foreign elections between 1946 and 2000, while the Soviet Union or Russia intervened in 36. A 2018 study by Levin found that the electoral interventions determined in "many cases" the identity of the winner. The study also found suggestive evidence that the interventions increased the risk of democratic breakdown in the targeted states.
Typologies of interventions:
In a 2012 study, Corstange and Marinov theorized that there are two types of foreign intervention: partisan intervention, where the foreign power takes a stance on its support for one side, and process intervention, where the foreign power seeks "to support the rules of democratic contest, irrespective of who wins". Their results from 1,703 participants found that partisan interventions had a polarizing effect on political and foreign relations views, with the side favored by the external power more likely to favor improvements in relations between the two, and having the converse effect for those opposed by the power.
In 2018, Jonathan Godinez further elaborated on Corstange and Marinov's theory by proposing that interventions can be specified as globally-motivated intervention, where "a country intervenes in the election of another country for the interests, betterment, or well-being of the international audience," and self-motivated intervention, where "a country intervenes in the election of another country to further the interests, betterment, or well-being of themselves."
Godinez further theorized that the vested interest of an intervening country can be identified by examining a "threefold methodology": the tactics of intervention, stated motivation, and the magnitude of the intervention.
Also in 2012, Shulman and Bloom theorized a number of distinct factors affecting the results of foreign interference:
- Agents of interference: each with a descending effect on resentment caused by their intervention, these being nations, international organizations, non-governmental organizations, and finally individuals.
- Partisanship of interference: whether foreign actors intervene to affect institutions and processes broadly, or intervene primarily to favor one side in a contest.
- Salience of interference: consisting of two elements: first, "how obvious and well-known is the interference", and second, "how clear and understandable is the intervention?"
Additionally, they theorized that national similarities between the foreign and domestic powers would decrease resentment, and may even render the interference welcome. In cases where national autonomy are of primary concern to the electorate, they predicted a diminished effect of the similarity or dissimilarity of the two powers on resentment.
Conversely, they predicted that in cases where national identity was a primary concern, the importance of similarity or dissimilarity would have a greater impact.
Click on any of the following blue hyperlinks for more about Foreign Election Interference:
- Bolivian election (U.S., 2002)
- Chilean elections
- French election (Libya, 2007 and Russia, 2017)
- German election (Turkey, 2017)
- Guinean election (France, 2010)
- Iranian election (U.S., 1952)
- Israeli elections
- Italian election (U.S., U.S.S.R., and Vatican's role, 1948)
- Japanese elections (U.S., U.S.S.R., 1950s–60s)
- Korean election (U.N., U.S.S.R., 1948)
- Palestinian election (U.S., Israel, 2006)
- Philippines election (U.S., 1953)
- Russian election (U.S., 1996)
- Taiwanese election (China, 2018)
- Togolese election (France, 2010)
- Ukrainian elections
- United Kingdom elections
- United States elections
- See also:
- Murchison letter regarding inadvertent British influence on the 1888 U.S. presidential election
- United States involvement in regime change
- United States involvement in regime change in Latin America
- Russia involvement in regime change
- Cambridge Analytica – British company worked in more than 200 elections around the world, including in Nigeria, the Czech Republic and Argentina.
- Internet Research Agency – Russian company, funded by Russian businessman Yevgeny Prigozhin, was implicated in interference in several elections in Europe and North America.
- Fancy Bear, another Russian conduit for cyberwarfare implicated in interference in several elections in Europe and North America.
- CIA influence on public opinion
- State-sponsored Internet propaganda
Electoral fraud, sometimes referred to as election fraud, election manipulation or vote rigging, is illegal interference with the process of an election, either by increasing the vote share of the favored candidate, depressing the vote share of the rival candidates, or both.
What exactly constitutes electoral fraud varies from country to country.
Many kinds of election fraud are outlawed in electoral legislations, but others are in violation of general laws, such as those banning assault, harassment or libel.
Although technically the term "electoral fraud" covers only those acts which are illegal, the term is sometimes used to describe acts which are legal, but considered morally unacceptable, outside the spirit of an election or in violation of the principles of democracy.
Show elections, containing only one candidate, are sometimes classified as electoral fraud, although they may comply with the law and are presented more as referendums.
In national elections, successful electoral fraud can have the effect of a coup d'état, democracy protest or corruption of democracy. In a narrow election, a small amount of fraud may be enough to change the result. Even if the outcome is not affected, the revelation of fraud can reduce voters' confidence in democracy.
Specific Methods:
A list of threats to voting systems, or electoral fraud methods considered as sabotage, is kept by the National Institute of Standards and Technology.
Electorate manipulation:
Electoral fraud can occur in advance of voting if the composition of the electorate is altered. The legality of this type of manipulation varies across jurisdictions. Deliberate manipulation of election outcomes is widely considered a violation of the principles of democracy.
Manipulation of demography:
In many cases, it is possible for authorities to artificially control the composition of an electorate in order to produce a foregone result. One way of doing this is to move a large number of voters into the electorate prior to an election, for example by temporarily assigning them land or lodging them in flophouses.
Many countries prevent this with rules stipulating that a voter must have lived in an electoral district for a minimum period (for example, six months) in order to be eligible to vote there.
However, such laws can also be used for demographic manipulation as they tend to disenfranchise those with no fixed address, such as the homeless, travelers, Roma, students (studying full-time away from home), and some casual workers.
Another strategy is to permanently move people into an electoral district, usually through public housing. If people eligible for public housing are likely to vote for a particular party, then they can either be concentrated into one area, thus making their votes count for less, or moved into marginal electorates, where they may tip the balance towards their preferred party. One notable example of this occurred in the City of Westminster in England under Shirley Porter.
Immigration law may also be used to manipulate electoral demography. For instance, Malaysia gave citizenship to immigrants from the neighboring Philippines and Indonesia, together with suffrage, in order for a political party to "dominate" the state of Sabah; this controversial process was known as Project IC.
A related method of manipulating primary contests and other elections of party leaders is for people who support one party to temporarily join another party (or vote in a crossover way, when permitted) in order to elect a weak candidate for that party's leadership.
The ultimate goal is for the party the voter truly supports to defeat the weak candidate in the general election. There were claims that this method was utilised in the UK Labour Party leadership election in 2015, when the Conservative-leaning Toby Young encouraged Conservatives to join Labour and vote for Jeremy Corbyn in order to "consign Labour to electoral oblivion".
Shortly after, #ToriesForCorbyn trended on Twitter.
Disenfranchisement:
The composition of an electorate may also be altered by disenfranchising some classes of people, rendering them unable to vote. In some cases, states have passed provisions that raised general barriers to voter registration, such as poll taxes, literacy and comprehension tests, and record-keeping requirements, which in practice were applied against minority populations to discriminatory effect.
From the turn of the century into the late 1960s, most African Americans in the southern states of the former Confederacy were disenfranchised by such measures. Corrupt election officials may misuse voting regulations such as a literacy test or requirement for proof of identity or address in such a way as to make it difficult or impossible for their targets to cast a vote.
If such practices discriminate against a religious or ethnic group, they may so distort the political process that the political order becomes grossly unrepresentative, as in the post-Reconstruction or Jim Crow era until the Voting Rights Act of 1965. Felons have been disenfranchised in many states as a strategy to prevent African Americans from voting.
Groups may also be disenfranchised by rules which make it impractical or impossible for them to cast a vote. For example, requiring people to vote within their electorate may disenfranchise serving military personnel, prison inmates, students, hospital patients or anyone else who cannot return to their homes.
Polling can be scheduled for inconvenient days, such as midweek, or on the Sabbath or other holy days of a religious group whose teachings prohibit voting on such a day. Communities may also be effectively disenfranchised if polling places are situated in areas perceived by voters as unsafe, or are not provided within reasonable proximity (rural communities are especially vulnerable to this).
In some cases, voters may be invalidly disenfranchised, which is true electoral fraud. For example, a legitimate voter may be "accidentally" removed from the electoral roll, making it difficult or impossible for the person to vote.
In the Canadian federal election of 1917, during the Great War, the Union government passed the Military Voters Act and the Wartime Elections Act. The Military Voters Act permitted any active military personnel to vote by party only and allowed that party to decide in which electoral district to place that vote. It also enfranchised those women who were directly related or married to an active soldier.
These groups were believed to be disproportionately in favor of the Union government, as that party was campaigning in favor of conscription. The Wartime Elections Act, conversely, disenfranchised particular ethnic groups assumed to be disproportionately in favour of the opposition Liberal Party.
Division of opposition support:
Stanford University professor Beatriz Magaloni described a model governing the behaviour of autocratic regimes. She proposed that ruling parties can maintain political control under a democratic system without actively manipulating votes or coercing the electorate.
Under the right conditions, the democratic system is maneuvered into an equilibrium in which divided opposition parties act as unwitting accomplices to single-party rule. This permits the ruling regime to abstain from illegal electoral fraud.
Preferential voting systems such as score voting, instant-runoff voting, and single transferable vote are designed to prevent systemic electoral manipulation and political duopoly.
Intimidation:
Voter intimidation involves putting undue pressure on a voter or group of voters so that they will vote a particular way, or not at all. Absentee and other remote voting can be more open to some forms of intimidation as the voter does not have the protection and privacy of the polling location.
Intimidation can take a range of forms, including verbal, physical, or coercive. It was so common that in 1887 the Kansas Supreme Court said, as quoted in New Perspectives on Election Fraud in the Gilded Age, that "[…] physical retaliation constituted only a slight disturbance and would not vitiate an election."
Violence or the threat of violence: In its simplest form, voters from a particular demographic or known to support a particular party or candidate are directly threatened by supporters of another party or candidate or by those hired by them.
In other cases, supporters of a particular party make it known that if a particular village or neighborhood is found to have voted the 'wrong' way, reprisals will be made against that community. Another method is to make a general threat of violence, for example, a bomb threat which has the effect of closing a particular polling place, thus making it difficult for people in that area to vote.
One notable example of outright violence was the 1984 Rajneeshee bioterror attack, where followers of Bhagwan Shree Rajneesh deliberately contaminated salad bars in The Dalles, Oregon, in an attempt to weaken political opposition during county elections.
Attacks on polling places: Polling places in an area known to support a particular party or candidate may be targeted for vandalism, destruction or threats, thus making it difficult or impossible for people in that area to vote.
Legal threats: In this case, voters will be made to believe, accurately or otherwise, that they are not legally entitled to vote, or that they are legally obliged to vote a particular way. Voters who are not confident about their entitlement to vote may also be intimidated by real or implied authority figures who suggest that those who vote when they are not entitled to will be imprisoned, deported or otherwise punished.
For example, in 2004, in Wisconsin and elsewhere voters allegedly received flyers that said, "If you already voted in any election this year, you can’t vote in the Presidential Election", implying that those who had voted in earlier primary elections were ineligible to vote. Also, "If anybody in your family has ever been found guilty of anything you can’t vote in the Presidential Election." Finally, "If you violate any of these laws, you can get 10 years in prison and your children will be taken away from you."
Another method, allegedly used in Cook County, Illinois in 2004, is to falsely tell particular people that they are not eligible to vote.
Coercion: The demographic that controlled the voting ballot would try to persuade others to follow them. By singling out those who were against the majority, people would attempt to switch the voters' decision. Their argument could be that since the majority sided with a certain candidate, the holdouts should admit defeat and join the winning side.
If this did not work, it led to the threats of violence seen countless times during elections. Coercion and electoral intimidation were also seen in the Navy: in 1885, William C. Whitney started an investigation involving Navy yard workers. As Whitney said, "the vote of the yard was practically coerced and controlled by the foremen." This instance shows how, even in the Navy, people went to great lengths for their desired candidate to win.
Vote buying:
Vote buying occurs when a political party or candidate seeks to buy the vote of a voter in an upcoming election. Vote buying can take various forms such as a monetary exchange, as well as an exchange for necessary goods or services. This practice is often used to incentivise or persuade voters to turn out to elections and vote in a particular way.
Despite the fact that this practice is illegal in many countries such as the United States, Argentina, Mexico, Kenya, Brazil and Nigeria, its prevalence remains worldwide.
In some parts of the United States in the mid- and late 19th century, members of competing parties would vie, sometimes openly and other times with much greater secrecy, to buy and sell votes. Voters would be compensated with cash or with the payment of their housing or tax bills.
To keep the practice of vote buying secret, parties would open fully staffed vote-buying shops. Parties would also hire runners, who would go out into the public and find floating voters and bargain with them to vote for their side.
In England, documentation and stories of vote buying and vote selling are also well known.
The most famous episodes of vote buying came in 18th century England when two or more rich aristocrats spent whatever money it took to win. The notorious "Spendthrift election" came in Northamptonshire in 1768, when three earls spent over £100,000 each to win a seat.
Voters may be given money or other rewards for voting in a particular way, or not voting. In some jurisdictions, the offer or giving of other rewards is referred to as "electoral treating".
Electoral treating remains legal in some jurisdictions, such as in the Seneca Nation of Indians.
Click on any of the following blue hyperlinks for more about Electoral Fraud:
Theoretical and empirical research on the effect of foreign electoral intervention had been characterized as weak overall as late as 2011; however, since then a number of such studies have been conducted.
One study indicated that the country intervening in most foreign elections is the United States with 81 interventions, followed by Russia (including the former Soviet Union) with 36 interventions from 1946 to 2000—an average of once in every nine competitive elections.
Academic studies:
Measurement of interventions:
A 2019 study by Lührmann et al. at the Varieties of Democracy Instutute in Sweden summarized reports from each country to say that in 2018 the most intense interventions, by means of false information on key political issues, were by China in Taiwan and by Russia in Latvia; the next highest levels were in Bahrain, Qatar and Hungary; the lowest levels were in Trinidad and Tobago, Switzerland and Uruguay.
A 2016 study by Dov H. Levin found that, among 938 global elections examined, the United States and Russia (including its predecessor, the Soviet Union) combined had involved themselves in about one out of nine (117), with the majority of those (68%) being through covert, rather than overt, actions.
The same study found that "on average, an electoral intervention in favor of one side contesting the election will increase its vote share by about 3 percent," an effect large enough to have potentially changed the results in seven out of 14 U.S. presidential elections occurring after 1960.
According to the study, the U.S. intervened in 81 foreign elections between 1946 and 2000, while the Soviet Union or Russia intervened in 36. A 2018 study by Levin found that the electoral interventions determined in "many cases" the identity of the winner. The study also found suggestive evidence that the interventions increased the risk of democratic breakdown in the targeted states.
Typologies of interventions:
In a 2012 study, Corstange and Marinov theorized that there are two types of foreign intervention: partisan intervention, where the foreign power takes a stance on its support for one side, and process intervention, where the foreign power seeks "to support the rules of democratic contest, irrespective of who wins". Their results from 1,703 participants found that partisan interventions had a polarizing effect on political and foreign relations views, with the side favored by the external power more likely to favor improvements in relations between the two, and having the converse effect for those opposed by the power.
In 2018, Jonathan Godinez further elaborated on Corstange and Marinov's theory by proposing that interventions can be specified as globally-motivated intervention, where "a country intervenes in the election of another country for the interests, betterment, or well-being of the international audience," and self-motivated intervention, where "a country intervenes in the election of another country to further the interests, betterment, or well-being of themselves."
Godinez further theorized that the vested interest of an intervening country can be identified by examining a "threefold methodology": the tactics of intervention, stated motivation, and the magnitude of the intervention.
Also in 2012, Shulman and Bloom theorized a number of distinct factors affecting the results of foreign interference:
- Agents of interference: each with a descending effect on resentment caused by their intervention, these being nations, international organizations, non-governmental organizations, and finally individuals.
- Partisanship of interference: whether foreign actors intervene to affect institutions and process broadly, or intervene primarily to favor one side in a contest
- Salience of interference: consisting of two elements. First, "how obvious and well-known is the interference", and second, "how clear and understandable is the intervention?"
Additionally, they theorized that national similarities between the foreign and domestic powers would decrease resentment, and may even render the interference welcome. In cases where national autonomy are of primary concern to the electorate, they predicted a diminished effect of the similarity or dissimilarity of the two powers on resentment.
Conversely, they predicted that in cases where national identity was a primary concern, the importance of similarity or dissimilarity would have a greater impact.
Click on any of the following blue hyperlinks for more about Foreign Election Interference:
- Bolivian election (U.S., 2002)
- Chilean elections
- French election (Libya, 2007 and Russia, 2017)
- German election (Turkey, 2017)
- Guinean election (France, 2010)
- Iranian election (U.S., 1952)
- Israeli elections
- Italian election (U.S., U.S.S.R., and Vatican's role, 1948)
- Japanese elections (U.S., U.S.S.R., 1950s–60s)
- Korean election (U.N., U.S.S.R., 1948)
- Palestinian election (U.S., Israel, 2006)
- Philippines election (U.S., 1953)
- Russian election (U.S., 1996)
- Taiwanese election (China, 2018)
- Togolese election (France, 2010)
- Ukrainian elections
- United Kingdom elections
- United States elections
- See also:
- Murchison letter regarding inadvertent British influence on the 1888 U.S. presidential election
- United States involvement in regime change
- United States involvement in regime change in Latin America
- Russia involvement in regime change
- Cambridge Analytica – British company worked in more than 200 elections around the world, including in Nigeria, the Czech Republic and Argentina.
- Internet Research Agency – Russian company, funded by Russian businessman Yevgeny Prigozhin, was implicated in interference in several elections in Europe and North America.
- Fancy Bear, another Russian conduit for cyberwarfare implicated in interference in several elections in Europe and North America.
- CIA influence on public opinion
- State-sponsored Internet propaganda
Electoral fraud, sometimes referred to as election fraud, election manipulation or vote rigging, is illegal interference with the process of an election, either by increasing the vote share of the favored candidate, depressing the vote share of the rival candidates, or both.
What exactly constitutes electoral fraud varies from country to country.
Many kinds of election fraud are outlawed in electoral legislations, but others are in violation of general laws, such as those banning assault, harassment or libel.
Although technically the term "electoral fraud" covers only those acts which are illegal, the term is sometimes used to describe acts which are legal, but considered morally unacceptable, outside the spirit of an election or in violation of the principles of democracy.
Show elections, containing only one candidate, are sometimes classified as electoral fraud, although they may comply with the law and are presented more as referendums.
In national elections, successful electoral fraud can have the effect of a coup d'état, democracy protest or corruption of democracy. In a narrow election, a small amount of fraud may be enough to change the result. Even if the outcome is not affected, the revelation of fraud can reduce voters' confidence in democracy.
Specific Methods:
A list of threats to voting systems, or electoral fraud methods considered as sabotage are kept by the National Institute of Standards and Technology.
Electorate manipulation:
Electoral fraud can occur in advance of voting if the composition of the electorate is altered. The legality of this type of manipulation varies across jurisdictions. Deliberate manipulation of election outcomes is widely considered a violation of the principles of democracy.
Manipulation of demography:
In many cases, it is possible for authorities to artificially control the composition of an electorate in order to produce a foregone result. One way of doing this is to move a large number of voters into the electorate prior to an election, for example by temporarily assigning them land or lodging them in flophouses.
Many countries prevent this with rules stipulating that a voter must have lived in an electoral district for a minimum period (for example, six months) in order to be eligible to vote there.
However, such laws can also be used for demographic manipulation as they tend to disenfranchise those with no fixed address, such as the homeless, travelers, Roma, students (studying full-time away from home), and some casual workers.
Another strategy is to permanently move people into an electoral district, usually through public housing. If people eligible for public housing are likely to vote for a particular party, then they can either be concentrated into one area, thus making their votes count for less, or moved into marginal electorates, where they may tip the balance towards their preferred party. One notable example of this occurred in the City of Westminster in England under Shirley Porter.
Immigration law may also be used to manipulate electoral demography. For instance, Malaysia gave citizenship to immigrants from the neighboring Philippines and Indonesia, together with suffrage, in order for a political party to "dominate" the state of Sabah; this controversial process was known as Project IC.
A related method targets primary contests and other elections of party leaders. People who support one party may temporarily join another party (or vote in a crossover way, when permitted) in order to elect a weak candidate for that party's leadership.
The ultimate goal is for the weak candidate to be defeated in the general election by the leader of the party the voter truly supports. There were claims that this method was utilised in the UK Labour Party leadership election in 2015, when Conservative-leaning Toby Young encouraged Conservatives to join Labour and vote for Jeremy Corbyn in order to "consign Labour to electoral oblivion".
Shortly after, #ToriesForCorbyn trended on Twitter.
Disenfranchisement:
The composition of an electorate may also be altered by disenfranchising some classes of people, rendering them unable to vote. In some cases, states have passed provisions that raised general barriers to voter registration, such as poll taxes, literacy and comprehension tests, and record-keeping requirements, which in practice were applied against minority populations to discriminatory effect.
From the turn of the 20th century into the late 1960s, most African Americans in the southern states of the former Confederacy were disenfranchised by such measures. Corrupt election officials may misuse voting regulations, such as a literacy test or a requirement for proof of identity or address, in such a way as to make it difficult or impossible for their targets to cast a vote.
If such practices discriminate against a religious or ethnic group, they may so distort the political process that the political order becomes grossly unrepresentative, as in the post-Reconstruction or Jim Crow era until the Voting Rights Act of 1965. Felons have been disenfranchised in many states as a strategy to prevent African Americans from voting.
Groups may also be disenfranchised by rules which make it impractical or impossible for them to cast a vote. For example, requiring people to vote within their electorate may disenfranchise serving military personnel, prison inmates, students, hospital patients or anyone else who cannot return to their homes.
Polling can be set for inconvenient days, such as midweek, or on the Sabbath or other holy days of a religious group whose teachings prohibit voting on such a day. Communities may also be effectively disenfranchised if polling places are situated in areas perceived by voters as unsafe, or are not provided within reasonable proximity; rural communities are especially vulnerable to this.
In some cases, voters may be invalidly disenfranchised, which is true electoral fraud. For example, a legitimate voter may be "accidentally" removed from the electoral roll, making it difficult or impossible for the person to vote.
In the Canadian federal election of 1917, during the Great War, the Union government passed the Military Voters Act and the Wartime Elections Act. The Military Voters Act permitted any active military personnel to vote by party only and allowed that party to decide in which electoral district to place that vote. It also enfranchised those women who were directly related or married to an active soldier.
These groups were believed to be disproportionately in favor of the Union government, as that party was campaigning in favor of conscription. The Wartime Elections Act, conversely, disenfranchised particular ethnic groups assumed to be disproportionately in favour of the opposition Liberal Party.
Division of opposition support:
Stanford University professor Beatriz Magaloni described a model governing the behaviour of autocratic regimes. She proposed that ruling parties can maintain political control under a democratic system without actively manipulating votes or coercing the electorate.
Under the right conditions, the democratic system is maneuvered into an equilibrium in which divided opposition parties act as unwitting accomplices to single-party rule. This permits the ruling regime to abstain from illegal electoral fraud.
Alternative voting systems such as score voting, instant-runoff voting, and single transferable vote are designed to prevent systemic electoral manipulation and political duopoly.
Intimidation:
Voter intimidation involves putting undue pressure on a voter or group of voters so that they will vote a particular way, or not at all. Absentee and other remote voting can be more open to some forms of intimidation as the voter does not have the protection and privacy of the polling location.
Intimidation can take a range of forms, including verbal abuse, physical violence, or coercion. It was so common that in 1887 the Kansas Supreme Court remarked, as quoted in New Perspectives on Election Fraud in the Gilded Age, that "[…] physical retaliation constituted only a slight disturbance and would not vitiate an election."
Violence or the threat of violence: In its simplest form, voters from a particular demographic or known to support a particular party or candidate are directly threatened by supporters of another party or candidate or by those hired by them.
In other cases, supporters of a particular party make it known that if a particular village or neighborhood is found to have voted the 'wrong' way, reprisals will be made against that community. Another method is to make a general threat of violence, for example, a bomb threat which has the effect of closing a particular polling place, thus making it difficult for people in that area to vote.
One notable example of outright violence was the 1984 Rajneeshee bioterror attack, where followers of Bhagwan Shree Rajneesh deliberately contaminated salad bars in The Dalles, Oregon, in an attempt to weaken political opposition during county elections.
Attacks on polling places: Polling places in an area known to support a particular party or candidate may be targeted for vandalism, destruction or threats, thus making it difficult or impossible for people in that area to vote.
Legal threats: In this case, voters will be made to believe, accurately or otherwise, that they are not legally entitled to vote, or that they are legally obliged to vote a particular way. Voters who are not confident about their entitlement to vote may also be intimidated by real or implied authority figures who suggest that those who vote when they are not entitled to will be imprisoned, deported or otherwise punished.
For example, in 2004, in Wisconsin and elsewhere voters allegedly received flyers that said, "If you already voted in any election this year, you can’t vote in the Presidential Election", implying that those who had voted in earlier primary elections were ineligible to vote. Also, "If anybody in your family has ever been found guilty of anything you can’t vote in the Presidential Election." Finally, "If you violate any of these laws, you can get 10 years in prison and your children will be taken away from you."
Another method, allegedly used in Cook County, Illinois in 2004, is to falsely tell particular people that they are not eligible to vote.
Coercion: The group that controlled the ballot would try to persuade others to follow its lead. By singling out those who opposed the majority, its members would attempt to change those voters' decisions, arguing that since the majority sided with a certain candidate, holdouts should admit defeat and join the winning side.
If this failed, it often led to the threats of violence seen countless times during elections. Electoral coercion and intimidation were seen even in the Navy: in 1885, William C. Whitney started an investigation involving the men of the Navy yards. As Whitney said, "the vote of the yard was practically coerced and controlled by the foremen." This instance shows that even in the Navy, people went to great lengths for their desired candidate to win.
Vote buying:
Vote buying occurs when a political party or candidate seeks to buy the vote of a voter in an upcoming election. Vote buying can take various forms such as a monetary exchange, as well as an exchange for necessary goods or services. This practice is often used to incentivise or persuade voters to turn out to elections and vote in a particular way.
Although the practice is illegal in many countries, including the United States, Argentina, Mexico, Kenya, Brazil and Nigeria, it remains prevalent worldwide.
In some parts of the United States in the mid- and late 19th century, members of competing parties would vie, sometimes openly and at other times with much greater secrecy, to buy and sell votes. Voters would be compensated with cash or with the payment of their housing or tax bills.
To keep the practice of vote buying secret, parties would open fully staffed vote-buying shops. Parties would also hire runners, who would go out into the public and find floating voters and bargain with them to vote for their side.
In England, documentation and stories of vote buying and vote selling are also well known.
The most famous episodes of vote buying came in 18th century England when two or more rich aristocrats spent whatever money it took to win. The notorious "Spendthrift election" came in Northamptonshire in 1768, when three earls spent over £100,000 each to win a seat.
Voters may be given money or other rewards for voting in a particular way, or not voting. In some jurisdictions, the offer or giving of other rewards is referred to as "electoral treating".
Electoral treating remains legal in some jurisdictions, such as in the Seneca Nation of Indians.
Click on any of the following blue hyperlinks for more about Electoral Fraud:
- Examples of Vote buying
- In legislature
- Prevention
- Notable United States legislation
- See also:
- Administrative resource
- American Center for Voting Rights
- Branch stacking
- Caging list
- Cooping
- Electoral integrity
- Florida Central Voter File (purging controversy)
- Foreign electoral intervention
- Gerrymandering
- List of controversial elections
- List of UK Parliamentary election petitions
- Notable instances of ballot box stuffing
- Political corruption
- Postal voting
- Show election
- Smear campaign
- Voter suppression
- Carousel voting
- Bulgarian train
- Voter Fraud - an article from the ACE Project
- Independent Verification: Essential Action to Assure Integrity in the Voting Process, Roy G. Saltman, August 22, 2006
- Legal provisions to prevent Electoral Fraud - an article from the ACE Project
- Was the 2004 Election Stolen? by Robert F. Kennedy Jr., June 1, 2006.
Conservatism in the United States
- YouTube Video: Reagan And Bush Sure Sound Different On Immigration Than Today's Field
- YouTube Video: Black Conservatives Debate Black Liberals on American Politics
- YouTube Video: Conservatives and Progressives Debate Feminism, #MeToo, and Donald Trump
American Conservatism is a broad system of political beliefs in the United States that is characterized by respect for American traditions, support for Judeo-Christian values, economic liberalism, anti-communism, advocacy of American exceptionalism, and a defense of Western culture from threats posed by "creeping socialism", moral relativism, multiculturalism, and liberal internationalism.
Liberty is a core value, with a particular emphasis on strengthening the free market, limiting the size and scope of government, and opposition to high taxes and government or labor union encroachment on the entrepreneur.
American conservatives consider individual liberty, within the bounds of conformity to American values, as the fundamental trait of democracy, which contrasts with modern American liberals, who generally place a greater value on equality and social justice.
American conservatism originated from the classical liberalism of the 18th and 19th centuries, which advocates civil liberties and political freedom with representative democracy under the rule of law and emphasizes economic freedom.
Historians argue that the conservative tradition has played a major role in American politics and culture since the 1790s. However, they have stressed that an organized conservative movement has played a key role in politics only since the 1950s. The recent movement is based in the Republican Party, though some Democrats were also important figures early in the movement's history.
The history of American conservatism has been marked by tensions and competing ideologies. Fiscal conservatives and libertarians favor limited government involvement in the economy, a flat tax, limited regulation, and free enterprise.
Social conservatives see traditional social values as threatened by secularism; they tend to support voluntary school prayer and oppose abortion and same sex marriage. Some also want the teaching of intelligent design or creationism allowed, as the topics are currently judicially prohibited in public schools.
The 21st century has seen an increasingly fervent conservative support for Second Amendment rights of private citizens to own firearms. Neoconservatives want to expand American ideals throughout the world. Paleoconservatives advocate restrictions on immigration, non-interventionist foreign policy, and stand in opposition to multiculturalism.
Nationwide, most factions, except some libertarians, support a unilateral foreign policy and a strong military. The conservative movement of the 1950s attempted to bring together these divergent strands, stressing the need for unity to prevent the spread of "godless communism."
William F. Buckley Jr., in the first issue of his magazine National Review in 1955, explained the standards of his magazine and helped make explicit the beliefs of American conservatives:
"Among our convictions: It is the job of centralized government (in peacetime) to protect its citizens' lives, liberty and property. All other activities of government tend to diminish freedom and hamper progress. The growth of government (the dominant social feature of this century) must be fought relentlessly. In this great social conflict of the era, we are, without reservations, on the libertarian side.
The profound crisis of our era is, in essence, the conflict between the Social Engineers, who seek to adjust mankind to scientific utopias, and the disciples of Truth, who defend the organic moral order. We believe that truth is neither arrived at nor illuminated by monitoring election results, binding though these are for other purposes, but by other means, including a study of human experience. On this point we are, without reservations, on the conservative side."
Click on any of the following blue hyperlinks for more about Conservatism in the United States:
- History
- Recent policies
- Types
- Electoral politics
- Other topics
- Historiography
- Thinkers and leaders
- See also: Competing ideologies
Progressivism in the United States
*-- Franklin D. Roosevelt and the "New Deal"
**-- Lyndon B. Johnson and the "Great Society"
***-- Bill Clinton and his Economic Policies that achieved the only budget surpluses since then (fiscal years 1998-2001)!
- YouTube Video of Senator Elizabeth Warren's Remarks on Finance Industry Reform - Roosevelt Institute
- YouTube Video: Social Security: The Greatest Government Policy of All Time?
- YouTube Video: The Story of Medicare: A Timeline
Progressivism in the United States is a broadly based reform movement that reached its height early in the 20th century and is generally considered to be middle class and reformist in nature.
It arose as a response to the vast changes brought by modernization, such as the growth of large corporations and railroads, and fears of corruption in American politics.
In the 21st century, progressives continue to embrace concepts such as environmentalism and social justice. Social progressivism, the view that governmental practices ought to be adjusted as society evolves, forms the ideological basis for many American progressives.
Historian Alonzo Hamby defined progressivism as the "political movement that addresses ideas, impulses, and issues stemming from modernization of American society. Emerging at the end of the nineteenth century, it established much of the tone of American politics throughout the first half of the century."
Click on any of the following blue hyperlinks for more about Progressivism in the United States:
- Progressive Era
- Purifying the electorate
- Municipal administration
- Efficiency
- Regulation of large corporations and monopolies
- Social work
- Conservation
- National politics
- Culture
- Progressivism in the 21st century
- Other progressive parties
- See also:
- Affordable Care Act
- Center for American Progress
- Christian left
- Communist Party USA
- Conference for Progressive Political Action
- Congressional Progressive Caucus
- Democratic Party (United States)
- Democratic socialism
- Demos (U.S. think tank)
- Economic interventionism
- Great Society
- Green Party of the United States
- Justice Party (United States)
- League for Independent Political Action
- Modern liberalism in the United States
- Mount Vernon Statement
- New Deal
- Occupy Movement
- Progressive Christianity
- Progressive Era
- Progressive Party
- Progressive States Network
- Roosevelt Institute
- Social democracy
- Socialist Party USA
- Square Deal
- War on poverty
- Welfare state
Democratic Socialism, including the Time Magazine 10/24/2018 Article "Socialism Was Once America's Political Taboo. Now, Democratic Socialism Is a Viable Platform. Here's What to Know"
- YouTube Video: Sen. Bernie Sanders: "Democratic Socialist Ideas Are Mainstream"
- YouTube Video: The Difference Between Democratic Socialism And Socialism | NBC News NOW
- YouTube Video: How Senator Bernie Sanders clarified his vision of Democratic Socialism at third Democratic Debate
[Your WebHost: The reason for adding this topic "Democratic Socialism" is that Bernie Sanders, who is running for President in the 2020 elections, claims he is a Democratic Socialist!]
Socialism is a political, social and economic philosophy encompassing a range of economic and social systems characterised by social ownership of the means of production and workers' self-management of enterprises. It includes the political theories and movements associated with such systems.
Social ownership can be public, collective, cooperative or of equity. While no single definition encapsulates many types of socialism, social ownership is the one common element. It aims to circumvent the inefficiencies and crises traditionally associated with capital accumulation and the profit system in capitalism.
Socialist systems are divided into non-market and market forms. Non-market socialism involves replacing factor markets and money with engineering and technical criteria based on calculation performed in-kind, thereby producing an economic mechanism that functions according to different economic laws from those of capitalism.
The socialist calculation debate, originated by the economic calculation problem, concerns the feasibility and methods of resource allocation for a planned socialist system.
By contrast, market socialism retains the use of monetary prices, factor markets and in some cases the profit motive, with respect to the operation of socially owned enterprises and the allocation of capital goods between them. Profits generated by these firms would be controlled directly by the workforce of each firm or accrue to society at large in the form of a social dividend.
Socialist politics has been both internationalist and nationalist in orientation; organised through political parties and opposed to party politics; at times overlapping with trade unions and at other times independent and critical of them; and present in both industrialised and developing nations.
Social democracy originated within the socialist movement, supporting economic and social interventions to promote social justice. While retaining socialism as a long-term goal, since the post-war period it has come to embrace a Keynesian mixed economy within a predominantly developed capitalist market economy and liberal democratic polity that expands state intervention to include income redistribution, regulation and a welfare state.
Economic democracy proposes a sort of market socialism, with more democratic control of companies, currencies, investments and natural resources.
The socialist political movement includes a set of political philosophies that originated in the revolutionary movements of the mid-to-late 18th century and out of concern for the social problems that were associated with capitalism. By the late 19th century, after the work of Karl Marx and his collaborator Friedrich Engels, socialism had come to signify opposition to capitalism and advocacy for a post-capitalist system based on some form of social ownership of the means of production.
By the 1920s, communism and social democracy had become the two dominant political tendencies within the international socialist movement, with socialism itself becoming "the most influential secular movement of the twentieth century".
While the emergence of the Soviet Union as the world's first nominally socialist state led to socialism's widespread association with the Soviet economic model, some economists and intellectuals argued that in practice the model functioned as a form of state capitalism or a non-planned administrative or command economy.
Socialist parties and ideas remain a political force with varying degrees of power and influence on all continents, heading national governments in many countries around the world. Today, many socialists have also adopted the causes of other social movements such as environmentalism, feminism and progressivism.
Click on any of the following blue hyperlinks for more about Socialism in the United States:
- Etymology
- History
- Contemporary socialist politics
- Social and political theory
- Economics
- Politics
- Criticism
- See also:
Domestic Violence in the United States (Pictured below: Victims of domestic violence)
Domestic violence in the United States is a form of violence that occurs within a domestic relationship.
Although domestic violence often occurs between partners in the context of an intimate relationship, it may also describe other household violence, such as violence against a child, by a child against a parent or violence between siblings in the same household.
It is recognized as an important social problem by governmental and non-governmental agencies, and various Violence Against Women Acts have been passed by the US Congress in an attempt to stem this tide.
Victimization from domestic violence transcends the boundaries of gender and sexual orientation. Women are more often the victims of domestic violence, and are more likely than men to suffer injuries or health consequences as a result of the incidents, but men are also subject to domestic violence in significant numbers, including in incidents of physical partner violence.
Significant percentages of LGBT couples also face domestic violence issues. Socially and economically disadvantaged groups in the U.S. regularly face higher rates of domestic violence than other groups. For example, about 60% of Native American women are physically assaulted in their lifetime by a partner or spouse.
Many scholarly studies of the problem have stated that domestic violence is often part of a dynamic of control and oppression in relationships, regularly involving multiple forms of physical and non-physical abuse taking place concurrently. Intimate terrorism is an ongoing, complicated use of control, power and abuse in which one person tries to assert systematic control over another psychologically.
Homeless shelters exist in many states as well as special hotlines for people to call for immediate assistance, with non-profit agencies trying to fight the stigma that people both face in reporting these issues.
Definitions:
According to the Merriam-Webster dictionary definition, domestic violence is: "the inflicting of physical injury by one family or household member on another; also: a repeated or habitual pattern of such behavior."
The following definition applies for the purposes of subchapter III of chapter 136 of title 42 of the US Code: "The term 'domestic violence' includes felony or misdemeanor crimes of violence committed by a current or former spouse of the victim, by a person with whom the victim shares a child in common, by a person who is cohabitating with or has cohabitated with the victim as a spouse, by a person similarly situated to a spouse of the victim under the domestic or family violence laws of the jurisdiction receiving grant monies, or by any other person against an adult or youth victim who is protected from that person’s acts under the domestic or family violence laws of the jurisdiction."
It was inserted into the Violence Against Women Act of 1994 by section 3(a) of the Violence Against Women and Department of Justice Reauthorization Act of 2005.
It also applies for the purposes of section 7275 of subpart 17 of Part D of subchapter V of chapter 70 of title 20, section 1437F of subchapter I of chapter 8 of title 42, and subchapter XII-H of chapter 46 of title 42 of the US Code.
It is also the definition used by the US Office on Violence Against Women (OVW)
Globally:
Kofi Annan, Secretary-General of the United Nations, declared in a 2006 report posted on the United Nations Development Fund for Women (UNIFEM) website that:
Violence against women and girls is a problem of pandemic proportions. At least one out of every three women around the world has been beaten, coerced into sex, or otherwise abused in her lifetime with the abuser usually someone known to her.
Forms:
Main article: Domestic violence
Domestic violence may include verbal, emotional, economic, physical and sexual abuse. All forms of domestic abuse have one purpose: to gain and maintain control over the victim. Abusers use many tactics to exert power over their spouse or partner: dominance, humiliation, isolation, threats, intimidation, denial and blame.
The dynamics between the couple may include:
Incidence:
Between 960,000 and 3,000,000 incidents of domestic violence are reported each year, while many other incidents go unreported. It is estimated that more than ten million people experience domestic violence in the U.S. each year.
The ten states with the highest rate of females murdered by males were, as of 2010,
In 2009, for homicides in which the victim to offender relationship could be identified, 93% of female victims were murdered by a male they knew, 63% of them in the context of an intimate relationship.
Several studies in the U.S. have found that domestic violence is more common in the families of police officers than in the general population. Early studies found that between 24% and 40% of participating families of police officers reported incidents of domestic violence.
Subsequent studies suggested possible rates of officer-involved domestic violence that ranged from 4.8% to 28%, meaning the rate could be the same as that of the general public. The prevalence of domestic violence in law enforcement is important, as police attitudes toward domestic violence affect the quality of police intervention in domestic violence situations.
Gender aspects of abuse:
See also: Domestic violence § Gender differences
In the United States, according to the Bureau of Justice Statistics in 1995 women reported a six times greater rate of intimate partner violence than men.
The National Crime Victimization Survey (NCVS) indicates that in 1998 about 876,340 violent crimes were committed in the U.S. against women by their current or former spouses, or boyfriends.
According to the Centers for Disease Control and Prevention, in the United States 4.8 million women suffer intimate partner related physical assaults and rapes and 2.9 million men are victims of physical assault from their partners.
Research reviews have concluded that the majority of women's physical violence against men is in response to being abused by those men. A 2010 systematic review of the literature on women's perpetration of IPV found that anger, self-defense and retaliation were common motivations but that distinguishing between self-defense and retaliation was difficult.
With regard to similar rates and a difference in methods, a study compiled by Martin S. Fiebert, suggested that women are as likely to be violent to men, but that men are less likely to be hurt. He, however, stated that men are seriously injured in 38 percent of the cases in which "extreme aggression" is used. Fiebert additionally stated that his work was not meant to minimize the serious effects of men who abuse women.
A 2011 review by researcher Chan Ko Ling from the University of Hong Kong found that perpetration of minor partner violence was equal for both men and women but more severe partner violence was far likelier to be perpetrated by men. His analysis found that men were more likely to beat up, choke or strangle their partners while women were more likely to throw objects, slap, kick, bite, punch, or hit with an object.
The Bureau of Justice Statistics also reported that women are far more likely to use objects. The National Institute of Justice contends that national surveys supported by NIJ, the Centers for Disease Control and Prevention, and the Bureau of Justice Statistics that examine more serious assaults do not support the conclusion of similar rates of male and female spousal assaults. These surveys are conducted within a safety or crime context and find more partner abuse by men against women.
Straus and Gelles found that in couples reporting spousal violence, 27 percent of the time the man struck the first blow; in 24 percent of cases, the woman initiated the violence. The rest of the time, the violence was mutual, with both partners brawling. The results were the same even when the most severe episodes of violence were analyzed. In order to counteract claims that the reporting data was skewed, female-only surveys were conducted, asking females to self-report, and the data was the same.
The simple tally of physical acts is typically found to be similar in those studies that examine both directions, but studies show that male violence is more serious. Male violence does more damage than female violence; women are more likely to be injured and/or hospitalized.
Women are more likely to be killed by an intimate than the reverse (according to the Department of Justice, the rate is 63.7% to 36.3%), and women in general are more likely to be killed by their spouses than by all other types of assailants combined.
Studies have found that men are much less likely to report victimization in these situations.
According to some studies, less than 1% of domestic violence cases are reported to the police. In the United States 10–35% of the population will be physically aggressive towards a partner at some point in their lives. As abuse becomes more severe women become increasingly overrepresented as victims. The National Violence Against Women Survey for 2000 reported that 25% of women and 7.6% of men reported being victims of intimate partner violence at some point in their lives.
The rate of intimate partner violence in the U. S. has declined since 1993.
In a 2006 Amnesty International report, The Maze of Injustice: The Failure to Protect Indigenous Women From Sexual Violence in the USA, the data for Native women indicates high levels of sexual violence. Statistics gathered by the U.S. government reveal that Native American and Alaska Native women are more than 2.5 times more likely to be sexual assaulted than women in the United States in general; more than one in three Native women will be raped in their lifetime.
Statistics Over a year:
During pregnancy:
Main article: Domestic violence and pregnancy
The United States was one of the countries identified by a United Nations study with a high rate of domestic violence resulting in death during pregnancy.
During one's lifetime:
Injury:
In 1992, domestic violence was the leading cause of injury for women between 15 and 44; more than rapes, muggings, and car accidents combined. The levels of domestic injury against men have not been investigated to the same extent.
Rape:
Main article: Rape
Murder:
Women are more likely than men to be murdered by an intimate partner. Of those killed by an intimate partner, about three quarters are female and about a quarter are male. In 1999 in the United States, 1,218 women and 424 men were killed by an intimate partner, and 1,181 females and 329 males were killed by their intimate partners in 2005. In 2007, 2,340 deaths were caused by intimate partner violence—making up 14% of all homicides. 70% of these deaths were females and 30% were males.
Dating violence:
Main articles: Dating abuse and Teen dating violence
Dating violence is often a precursor to domestic violence. 22% of high school girls and 32% of college women experienced dating violence in a 2000 study. 20.6% of women experienced two or more types of dating violence and 8.3% of women experienced rape, stalking or physical aggression while dating. The levels of dating violence against men has not been investigated to the same extent.
Stalking:
According to the Centers for Disease Control National Intimate Partner Violence Survey results of 2010, 1 in 6 women (15.2%) have been stalked during their lifetime, compared to 1 in 19 men (5.7%).
Additionally, 1 in 3 bisexual women (37%) and 1 in 6 heterosexual women (16%) have experienced stalking victimization in their lifetime during which they felt very fearful or believed that they or someone close to them would be harmed or killed.
Socio-economic impacts:
While domestic violence crosses all socio-economic classes, Intimate Terrorism (IT) is more prevalent among poor people. When evaluating situational couple violence, poor people, subject to greater strains, have the highest percentage of situational couple violence, which does not necessarily involve serious violence.
Regarding ethnicity, socio-economic standing and other factors often have more to do with rates of domestic violence. When comparing the African American population to European Americans by socio-economic class, the rates of domestic violence are roughly the same. Since there are more poor African Americans, though, there is a higher incidence of domestic violence overall. It is not possible to evaluate the rate of domestic violence by ethnicity alone, because of the variability of cultural, economic and historical influences and the forms of domestic violence (situational couple violence, intimate terrorism) affecting each population of people.
Effects on children:
Up to 10 to 20% children in the United States witness abuse of a parent or caregiver annually. As a result, they are more likely to experience neglect or abuse, less likely to succeed at school, have poor problem-solving skills, subject to higher incidence of emotional and behavioral problems, and more likely to tolerate violence in their adult relationships.
Complicating this already bleak picture, parental psychopathology in the wake of domestic violence can further compromise the quality of parenting, and in turn increase the risk for the child's developing emotional and behavioral difficulties if mental health care is not sought.
Homelessness:
Main article: Homeless women in the United States
According to the authors of "Housing Problems and Domestic Violence," 38% of domestic violence victims will become homeless in their lifetime. Domestic violence is the direct cause of homelessness for over half of all homeless women in the United States. According to the U.S. Department of Housing and Urban Development, domestic violence is the third leading cause of homelessness among families.
Economic impacts:
Economic abuse can occur across all socio-economic levels.
The National Coalition Against Domestic Violence in the United States reports that:
The Centers for Disease Control has released that the medical care, mental health services, and lost productivity cost of intimate partner violence was an estimated $8.3 billion in 2003 dollars for women alone.
Religion:
Main articles: Christianity and domestic violence and Islam and domestic violence
One 2004 study by William Bradford Wilcox examined the relationship between religious affiliation, church attendance, and domestic violence, using data on wives' reports of spousal violence from three national United States surveys conducted between 1992 and 1994.
The study found that the lowest reported rates of domestic violence occurred among active conservative Protestants (2.8% of husbands committed domestic violence), followed by those who were religiously unaffiliated (3.2%), nominal mainline Protestants (3.9%), active mainline Protestants (5.4%), and nominal conservative Protestants (7.2%).
Overall (including both nominal and active members), the rates among conservative Protestants and mainline Protestants were 4.8% and 4.3%, respectively.
Examining Wilcox's study, Van Leewen finds that the parenting style of conservative Protestant fathers is characterized by features which have been linked to positive outcomes among children and adolescents, that there is no evidence that gender-traditionalist ideology of the "soft patriarchal" kind is a strong predictor of domestic physical abuse, and that "gender hierarchicalist males" who are frequent and active church members function positively in the domestic environment.
Another 2007 study by Christopher G. Ellison suggested that "religious involvement, specifically church attendance, protects against domestic violence, and this protective effect is stronger for African American men and women and for Hispanic men, groups that, for a variety of reasons, experience elevated risk for this type of violence."
Click on any of the following blue hyperlinks for more about domestic violence in the United States:
Although domestic violence often occurs between partners in the context of an intimate relationship, it may also describe other household violence, such as violence against a child, by a child against a parent or violence between siblings in the same household.
It is recognized as an important social problem by governmental and non-governmental agencies, and various Violence Against Women Acts have been passed by the US Congress in an attempt to stem this tide.
Victimization from domestic violence transcends the boundaries of gender and sexual orientation. Women are more often the victims of domestic violence, and are more likely than men to suffer injuries or health consequences as a result of the incidents, but men are also subject to domestic violence in significant numbers, including in incidents of physical partner violence.
Significant percentages of LGBT couples also face domestic violence issues. Social and economically disadvantaged groups in the U.S. regularly face worse rates of domestic violence than other groups. For example, about 60% of Native American women are physically assaulted in their lifetime by a partner or spouse.
Many scholarly studies of the problem have stated that domestic violence is often part of a dynamic of control and oppression in relationships, regularly involving multiple forms of physical and non-physical abuse taking place concurrently. Intimate terrorism is an ongoing, complicated use of control, power and abuse in which one person tries to assert systematic control over another psychologically.
Homeless shelters exist in many states as well as special hotlines for people to call for immediate assistance, with non-profit agencies trying to fight the stigma that people both face in reporting these issues.
Definitions:
According to the Merriam-Webster dictionary definition, domestic violence is: "the inflicting of physical injury by one family or household member on another; also: a repeated or habitual pattern of such behavior."
The following definition applies for the purposes of subchapter III of chapter 136 of title 42 of the US Code: "The term 'domestic violence' includes felony or misdemeanor crimes of violence committed by a current or former spouse of the victim, by a person with whom the victim shares a child in common, by a person who is cohabitating with or has cohabitated with the victim as a spouse, by a person similarly situated to a spouse of the victim under the domestic or family violence laws of the jurisdiction receiving grant monies, or by any other person against an adult or youth victim who is protected from that person’s acts under the domestic or family violence laws of the jurisdiction."
It was inserted into the Violence Against Women Act of 1994 by section 3(a) of the Violence Against Women and Department of Justice Reauthorization Act of 2005.
It also applies for the purposes of section 7275 of subpart 17 of Part D of subchapter V of chapter 70 of title 20, section 1437F of subchapter I of chapter 8 of title 42, and subchapter XII-H of chapter 46 of title 42 of the US Code.
It is also the definition used by the US Office on Violence Against Women (OVW).
Globally:
Kofi Annan, Secretary-General of the United Nations, declared in a 2006 report posted on the United Nations Development Fund for Women (UNIFEM) website that:
Violence against women and girls is a problem of pandemic proportions. At least one out of every three women around the world has been beaten, coerced into sex, or otherwise abused in her lifetime with the abuser usually someone known to her.
Forms:
Main article: Domestic violence
Domestic violence may include verbal, emotional, economic, physical and sexual abuse. All forms of domestic abuse have one purpose: to gain and maintain control over the victim. Abusers use many tactics to exert power over their spouse or partner: dominance, humiliation, isolation, threats, intimidation, denial and blame.
The dynamics between the couple may include:
- Situational couple violence, which arises out of conflicts that escalate to arguments and then to violence, is not connected to a general pattern of control. It is generally infrequent and is likely the most common type of intimate partner violence. Women are as likely as men to be abusers; however, women are somewhat more likely to be physically injured, but are also more likely to successfully obtain police intervention.
- Intimate terrorism (IT) involves an ongoing pattern of control using emotional, physical and other forms of domestic violence, and is the type that most often leads victims to seek help. It is what traditionally defined domestic violence, and it is commonly depicted with the "Power and Control Wheel," which illustrates the different, inter-related forms of abuse.
- Violent resistance (VR), or "self-defense", is violence perpetrated by victims against their abusive partners.
- Common couple violence, in which both partners engage in domestic violence.
- Mutual violent control (MVC) is a rare type of intimate partner violence that occurs when both partners act in a violent manner, battling for control.
Incidence:
Between 960,000 and 3,000,000 incidents of domestic violence are reported each year, while many other incidents go unreported. It is estimated that more than ten million people experience domestic violence in the U.S. each year.
As of 2010, the ten states with the highest rates of females murdered by males were:
- Nevada,
- South Carolina,
- Tennessee,
- Louisiana,
- Virginia,
- Texas,
- New Mexico,
- Hawaii,
- Arizona, and
- Georgia.
In 2009, for homicides in which the victim to offender relationship could be identified, 93% of female victims were murdered by a male they knew, 63% of them in the context of an intimate relationship.
Several studies in the U.S. have found that domestic violence is more common in the families of police officers than in the general population. Early studies found that between 24% and 40% of participating families of police officers reported incidents of domestic violence.
Subsequent studies suggested possible rates of officer-involved domestic violence that ranged from 4.8% to 28%, meaning the rate could be the same as that of the general public. The prevalence of domestic violence in law enforcement is important, as police attitudes toward domestic violence affect the quality of police intervention in domestic violence situations.
Gender aspects of abuse:
See also: Domestic violence § Gender differences
In the United States, according to the Bureau of Justice Statistics, women in 1995 reported intimate partner violence at a rate six times greater than men.
The National Crime Victimization Survey (NCVS) indicates that in 1998 about 876,340 violent crimes were committed in the U.S. against women by their current or former spouses, or boyfriends.
According to the Centers for Disease Control and Prevention, in the United States 4.8 million women suffer intimate partner-related physical assaults and rapes, and 2.9 million men are victims of physical assault from their partners.
Research reviews have concluded that the majority of women's physical violence against men is in response to being abused by those men. A 2010 systematic review of the literature on women's perpetration of IPV found that anger, self-defense and retaliation were common motivations but that distinguishing between self-defense and retaliation was difficult.
With regard to similar rates and a difference in methods, a study compiled by Martin S. Fiebert suggested that women are as likely to be violent toward men, but that men are less likely to be hurt. He stated, however, that men are seriously injured in 38 percent of the cases in which "extreme aggression" is used. Fiebert additionally stated that his work was not meant to minimize the serious effects of men who abuse women.
A 2011 review by researcher Chan Ko Ling from the University of Hong Kong found that perpetration of minor partner violence was equal for both men and women but more severe partner violence was far likelier to be perpetrated by men. His analysis found that men were more likely to beat up, choke or strangle their partners while women were more likely to throw objects, slap, kick, bite, punch, or hit with an object.
The Bureau of Justice Statistics also reported that women are far more likely to use objects. The National Institute of Justice contends that national surveys supported by NIJ, the Centers for Disease Control and Prevention, and the Bureau of Justice Statistics that examine more serious assaults do not support the conclusion of similar rates of male and female spousal assaults. These surveys are conducted within a safety or crime context and find more partner abuse by men against women.
Straus and Gelles found that in couples reporting spousal violence, 27 percent of the time the man struck the first blow; in 24 percent of cases, the woman initiated the violence. The rest of the time, the violence was mutual, with both partners brawling. The results were the same even when the most severe episodes of violence were analyzed. In order to counteract claims that the reporting data was skewed, female-only surveys were conducted, asking females to self-report, and the data was the same.
The simple tally of physical acts is typically found to be similar in those studies that examine both directions, but studies show that male violence is more serious. Male violence does more damage than female violence; women are more likely to be injured and/or hospitalized.
Women are more likely to be killed by an intimate than the reverse (according to the Department of Justice, the rate is 63.7% to 36.3%), and women in general are more likely to be killed by their spouses than by all other types of assailants combined.
Studies have found that men are much less likely to report victimization in these situations.
According to some studies, less than 1% of domestic violence cases are reported to the police. In the United States 10–35% of the population will be physically aggressive towards a partner at some point in their lives. As abuse becomes more severe women become increasingly overrepresented as victims. The National Violence Against Women Survey for 2000 reported that 25% of women and 7.6% of men reported being victims of intimate partner violence at some point in their lives.
The rate of intimate partner violence in the U.S. has declined since 1993.
A 2006 Amnesty International report, The Maze of Injustice: The Failure to Protect Indigenous Women From Sexual Violence in the USA, indicates high levels of sexual violence against Native women. Statistics gathered by the U.S. government reveal that Native American and Alaska Native women are more than 2.5 times more likely to be sexually assaulted than women in the United States in general; more than one in three Native women will be raped in their lifetime.
Statistics:
Over a year:
- 1% of all women (age > 18) who participated in a UN national study in 1995–96, who may or may not have been married or partnered, were victims of domestic abuse within the previous 12-month period. Since this population included women who had never been partnered, the prevalence of domestic violence may have been greater.
- A report by the United States Department of Justice in 2000 found that 1.3% of women and 0.9% of men reported experiencing domestic violence in the past year.
- About 2.3 million people are raped or physically assaulted each year by a current or former intimate partner or spouse.
- Physically assaulted women receive an average of 6.9 physical assaults by the same partner per year.
During pregnancy:
Main article: Domestic violence and pregnancy
The United States was one of the countries identified by a United Nations study with a high rate of domestic violence resulting in death during pregnancy.
During one's lifetime:
- According to The Centers for Disease Control and Prevention and The National Institute of Justice, nearly 25% of women experience at least one physical assault during adulthood by a partner.
- 22% of women had been subject to domestic violence during some period of their life, according to a United Nations study. Since this population included women who had never been married or partnered, the prevalence of domestic violence may have been greater.
- According to a report by the United States Department of Justice in 2000, a survey of 16,000 Americans showed 22.1 percent of women and 7.4 percent of men reported being physically assaulted by a current or former spouse, cohabiting partner, boyfriend or girlfriend, or date in their lifetime.
- 60% of American Indian and Alaska Native women will be physically assaulted in their lifetime.
- A 2013 report by the American Centers for Disease Control and Prevention (CDC) found that 26% of male homosexuals and 44% of lesbians surveyed reported experiencing intimate partner violence. The study evaluated 2010 data from the National Intimate Partner and Sexual Violence Survey, which involved over 16,000 U.S. adults.
Injury:
In 1992, domestic violence was the leading cause of injury for women between the ages of 15 and 44, exceeding rapes, muggings, and car accidents combined. The levels of domestic injury against men have not been investigated to the same extent.
Rape:
Main article: Rape
- 1 in 71 men and 1 in 5 women have experienced an attempted or completed rape committed by a partner. More than one in three American Indian and Alaska Native women will be raped in their lifetimes.
- A 2013 CDC study stated that 28% of straight women who had been raped experienced their first rape as a child, with the crime taking place between the ages of 11 and 17.
Murder:
Women are more likely than men to be murdered by an intimate partner. Of those killed by an intimate partner, about three quarters are female and about one quarter are male. In 1999 in the United States, 1,218 women and 424 men were killed by an intimate partner; in 2005, 1,181 females and 329 males were killed by their intimate partners. In 2007, intimate partner violence caused 2,340 deaths, making up 14% of all homicides; 70% of the victims were female and 30% were male.
Dating violence:
Main articles: Dating abuse and Teen dating violence
Dating violence is often a precursor to domestic violence. In a 2000 study, 22% of high school girls and 32% of college women reported experiencing dating violence; 20.6% of women experienced two or more types of dating violence, and 8.3% of women experienced rape, stalking or physical aggression while dating. The levels of dating violence against men have not been investigated to the same extent.
Stalking:
According to the Centers for Disease Control and Prevention's 2010 National Intimate Partner and Sexual Violence Survey results, 1 in 6 women (15.2%) have been stalked during their lifetime, compared to 1 in 19 men (5.7%).
Additionally, 1 in 3 bisexual women (37%) and 1 in 6 heterosexual women (16%) have experienced stalking victimization in their lifetime during which they felt very fearful or believed that they or someone close to them would be harmed or killed.
Socio-economic impacts:
While domestic violence crosses all socio-economic classes, intimate terrorism (IT) is more prevalent among poor people. Poor people, subject to greater strains, also show the highest rates of situational couple violence, which does not necessarily involve serious violence.
Regarding ethnicity, socio-economic standing and other factors often have more to do with rates of domestic violence than ethnicity itself. When African Americans and European Americans of the same socio-economic class are compared, their rates of domestic violence are roughly the same. Because a larger share of African Americans are poor, however, the overall incidence of domestic violence is higher. The rate of domestic violence cannot be evaluated by ethnicity alone, because of the variability of cultural, economic and historical influences and of the forms of domestic violence (situational couple violence, intimate terrorism) affecting each population.
Effects on children:
An estimated 10 to 20% of children in the United States witness abuse of a parent or caregiver annually. As a result, they are more likely to experience neglect or abuse, less likely to succeed at school, have poorer problem-solving skills, suffer a higher incidence of emotional and behavioral problems, and are more likely to tolerate violence in their adult relationships.
Complicating this already bleak picture, parental psychopathology in the wake of domestic violence can further compromise the quality of parenting, and in turn increase the risk for the child's developing emotional and behavioral difficulties if mental health care is not sought.
Homelessness:
Main article: Homeless women in the United States
According to the authors of "Housing Problems and Domestic Violence," 38% of domestic violence victims will become homeless in their lifetime. Domestic violence is the direct cause of homelessness for over half of all homeless women in the United States. According to the U.S. Department of Housing and Urban Development, domestic violence is the third leading cause of homelessness among families.
Economic impacts:
Economic abuse can occur across all socio-economic levels.
The National Coalition Against Domestic Violence in the United States reports that:
- 25% - 50% of victims of abuse from a partner have lost their job due to domestic violence.
- 35% - 56% of victims of domestic violence are harassed at work by their partners.
- More than 1.75 million workdays are lost each year to domestic violence. Lost productivity due to missed workdays and decreased productivity, with increased health and safety costs, results in a loss of $3 to $5 billion each year.
The Centers for Disease Control and Prevention has estimated that the medical care, mental health services, and lost productivity attributable to intimate partner violence against women cost $8.3 billion in 2003 dollars.
Religion:
Main articles: Christianity and domestic violence and Islam and domestic violence
One 2004 study by William Bradford Wilcox examined the relationship between religious affiliation, church attendance, and domestic violence, using data on wives' reports of spousal violence from three national United States surveys conducted between 1992 and 1994.
The study found that the lowest reported rates of domestic violence occurred among active conservative Protestants (2.8% of husbands committed domestic violence), followed by those who were religiously unaffiliated (3.2%), nominal mainline Protestants (3.9%), active mainline Protestants (5.4%), and nominal conservative Protestants (7.2%).
Overall (including both nominal and active members), the rates among conservative Protestants and mainline Protestants were 4.8% and 4.3%, respectively.
Examining Wilcox's study, Van Leeuwen finds that the parenting style of conservative Protestant fathers is characterized by features that have been linked to positive outcomes among children and adolescents; that there is no evidence that gender-traditionalist ideology of the "soft patriarchal" kind is a strong predictor of domestic physical abuse; and that "gender hierarchicalist" males who are frequent and active church members function positively in the domestic environment.
Another 2007 study by Christopher G. Ellison suggested that "religious involvement, specifically church attendance, protects against domestic violence, and this protective effect is stronger for African American men and women and for Hispanic men, groups that, for a variety of reasons, experience elevated risk for this type of violence."
Click on any of the following blue hyperlinks for more about domestic violence in the United States:
- Religion
- History
- Laws
- Law enforcement
- State due diligence
- Freedom from domestic violence resolution movement
- Support organizations
- Reduction programs
- See also:
- Uniform Child Abduction Prevention Act
- Honor killing in the United States
- Legal remedies:
- Address confidentiality program, some states in the United States
- Injunction
- Restraining order
- Organizations:
- AHA Foundation (Muslim women's rights in western countries)
- Futures Without Violence
- National Coalition Against Domestic Violence
- National Network to End Domestic Violence
- Peaceful Families Project (Muslim organization)
- Stop Abuse For Everyone, inclusive of all domestic violence victims regardless of age, gender, LGBT status, etc.
- Tahirih Justice Center
- Convicted Women Against Abuse
- General:
Different Forms of Sanctuary
TOP:
(L) Farm animal sanctuaries provide refuge for homeless livestock
(R) US Churches Are Opening Doors To The Homeless, Giving Them A Safe Place To Sleep And Survive
BOTTOM:
(L) Breathe: Escape to the Privacy of your Backyard Sanctuary
(R) National Aquarium’s Dolphin Sanctuary
- YouTube Video: Down on the Farm: With Farm Animal Whisperer Susie Coston
- YouTube Video: How sanctuary cities actually work
- YouTube Video: When Homelessness Reaches Middle-Class Working Families | Megyn Kelly TODAY
A sanctuary, in its original meaning, is a sacred place, such as a shrine. By the use of such places as a haven, by extension the term has come to be used for any place of safety. This secondary use can be categorized into human sanctuary, a safe place for humans, such as a political sanctuary; and non-human sanctuary, such as an animal or plant sanctuary.
Religious sanctuary
Sanctuary is a word derived from the Latin sanctuarium, which is, like most words ending in -arium, a container for keeping something in—in this case holy things or perhaps cherished people (sanctae/sancti).
The meaning was extended to places of holiness or safety, in particular the whole demarcated area, often many acres, surrounding a Greek or Roman temple; the original terms for these are temenos in Greek and fanum in Latin, but both may be translated as sanctuary. Similar usage may be sometimes found describing sacred areas in other religions.
In Christian churches "sanctuary" has a specific meaning, covering part of the interior, covered below.
Sanctuary as area around the altar:
Main article: Chancel
In many Western Christian traditions including Catholic, Lutheran, Methodist, and Anglican churches, the area around the altar is called the sanctuary; it is also considered holy because of the physical presence of God in the Eucharist, both during the Mass and in the church tabernacle the rest of the time.
In many churches the architectural term chancel covers the same area as the sanctuary, and either term may be used. In some Protestant churches, the term sanctuary denotes the entire worship area while the term chancel is used to refer to the area around the altar-table.
In many Western traditions altar rails sometimes mark the edge of the sanctuary or chancel.
In the Eastern Orthodox Church, in Eastern Catholic Churches such as the Syro-Malabar Church and those of the Byzantine rite, and in the Coptic Orthodox Church, the sanctuary is separated from the nave (where the people pray) by an iconostasis, literally a wall of icons, with three doors in it. In other Oriental Orthodox traditions, a sanctuary curtain is used.
The terminology that applies the word "sanctuary" to the area around the altar does not apply to Christian churches alone: King Solomon's temple, built in about 950 BC, had a sanctuary ("Holy of Holies") where the Ark of the Covenant was, and the term applies to the corresponding part of any house of worship.
In most modern synagogues, the main room for prayer is known as the sanctuary, to contrast it with smaller rooms dedicated to various other services and functions. (There is a raised bimah in the sanctuary, from which services are conducted, which is where the ark holding the Torah may reside; some synagogues, however, have a separate bimah and ark-platform.)
Sanctuary as a sacred place:
In Europe, Christian churches were sometimes built on land considered to be a particularly holy spot, perhaps where a miracle or martyrdom was believed to have taken place or where a holy person was buried. Examples are St. Peter's Basilica in Rome and St. Albans Cathedral in England, which commemorate the martyrdom of Saint Peter (the first Pope) and Saint Alban (the first Christian martyr in Britain), respectively.
The place, and therefore the church built there, was considered to have been sanctified (made holy) by what happened there. In modern times, the Catholic Church has continued this practice by placing in the altar of each church, when it is consecrated for use, a box (the sepulcrum) containing relics of a saint.
The relics box is removed when the church is taken out of use as a church. In the Eastern Orthodox Church, the antimension on the altar serves a similar function. It is a cloth icon of Christ's body taken down from the cross, and typically has the relics of a saint sewn into it. In addition, it is signed by the parish's bishop, and represents his authorization and blessing for the Eucharist to be celebrated on that altar.
Human sanctuary:
Traditions of Sanctuary:
Although the word "sanctuary" is often traced back only as far as the Greek and Roman empires, the concept itself has likely been part of human cultures for thousands of years. The idea that persecuted persons should be given a place of refuge is ancient, perhaps even primordial, deriving from basic features of human altruism.
In studying the concept across many cultures and eras, anthropologists have found sanctuary to be a near-universal notion, one that appears in almost all major religious traditions and in a wide variety of geographies.
"Cities of refuge" as described by the Book of Numbers and Deuteronomy in the Old Testament, as well as the Bedouin idea of nazaala, or the "taking of refuge," indicate a strong tradition of sanctuary in the Middle East and Northern Africa. In the Americas, many native tribes shared similar practices, particularly in the face of invading European powers. Despite tensions between groups, many tribes still offered and received sanctuary, taking in those who had fled their tribal lands or feared persecution by the Spanish, English, and French.
Legal sanctuary:
In the classical world, some (but not all) temples offered sanctuary to criminals or runaway slaves. When referring to prosecution of crimes, sanctuary can mean one of the following:
Church sanctuary:
Main article: Place of Refuge § Medieval England
A sacred place, such as a church, in which fugitives formerly were immune to arrest (recognized by English law from the fourth to the seventeenth century). While the practice of churches offering sanctuary is still observed in the modern era, it no longer has any legal effect and is respected solely for the sake of tradition.
Political sanctuary:
Immunity to arrest afforded by a sovereign authority. The United Nations has expanded the definition of "political" to include race, nationality, religion, political opinions and membership or participation in any particular social group or social activities. People seeking political sanctuary typically do so by asking a sovereign authority for asylum.
Right of asylum:
Main article: Right of asylum
Many ancient peoples recognized a religious right of asylum, protecting criminals (or those accused of crime) from legal action, and from exile, to some extent. This principle was adopted by the early Christian church, and various rules developed governing what a person had to do to qualify for protection and how much protection was granted.
In England, King Æthelberht made the first laws regulating sanctuary in about AD 600, though Geoffrey of Monmouth in his Historia Regum Britanniae (c. 1136) says that the legendary pre-Saxon king Dunvallo Molmutius (4th/5th century BC) enacted sanctuary laws in the Molmutine Laws as recorded by Gildas (c. 500–570).
By Norman times, there had come to be two kinds of sanctuary: all churches had the lower-level kind, but only churches licensed by the king had the broader version. The medieval system of asylum was finally abolished entirely in England by James I in 1623.
Political asylum:
During the Wars of the Roses in the 15th century, when the Lancastrians or Yorkists would suddenly gain the upper hand by winning a battle, some adherents of the losing side might find themselves surrounded by adherents of the winning side and unable to return to their own. They would rush to sanctuary at the nearest church until it was safe to leave. A prime example is Queen Elizabeth Woodville, consort of Edward IV of England.
In 1470, when the Lancastrians briefly restored Henry VI to the throne, Edward's queen was living in London with several young daughters. She moved with them into Westminster Abbey for sanctuary, living there in royal comfort until Edward was restored to the throne in 1471; she gave birth to their first son, Edward, during that time.
When King Edward IV died in 1483, Elizabeth (who was highly unpopular with even the Yorkists and probably did need protection) took her five daughters and youngest son (Richard, Duke of York; Prince Edward had his own household by then) and again moved into sanctuary at Westminster Abbey. She had all the comforts of home; she brought so much furniture and so many chests that the workmen had to break holes in some of the walls to move everything in fast enough to suit her.
In the 20th century, during World War I, Russia's Allies made the controversial decision in 1917 to deny political sanctuary to Tsar Nicholas II and his immediate family after he was overthrown in that year's February Revolution and forced to abdicate in March in favor of the Russian Provisional Government, later headed by Alexander Kerensky.
Nicholas, his family, and their remaining household were sent to Tobolsk, Siberia, that summer. Kerensky kept Russia in a war it could not win, enabling Lenin and his Bolsheviks to gain popular support and overthrow the Provisional Government in that year's October Revolution.
The Russian Civil War began that November. In July 1918, with the Bolsheviks losing ground in the fighting, Nicholas and his family were executed on Lenin's orders while confined to the Ipatiev House in Yekaterinburg.
In 1939, months before World War II began, 937 Jewish refugees from Nazi Germany aboard the MS St. Louis were likewise denied refuge, first by Cuba, their original destination, and afterwards by the United States and Canada. As a result, 620 of them were forced back to Europe, where 254 died in Nazi concentration camps during the war.
This incident was the subject of Gordon Thomas and Max Morgan-Witts' 1974 book Voyage of the Damned and its 1976 film adaptation.
In 1970, Simonas Kudirka was denied U.S. sanctuary when he attempted to defect from the Soviet Union by jumping from his ship, the Sovetskaya Litva, onto the USCGC Vigilant, which had sailed from New Bedford, while Kudirka's ship was anchored off Martha's Vineyard.
Kudirka was accused of stealing 3,000 rubles from the Sovetskaya Litva's safe, and when the U.S. State Department failed to help him, he was returned to the Soviet Union, where he was convicted of treason and sentenced to ten years of hard labor. Because Kudirka could claim American citizenship through his mother, however, he was allowed to return to the United States in 1974.
His plight was the subject of Algis Ruksenas' 1973 book Day of Shame: The Truth About The Murderous Happenings Aboard the Cutter Vigilant During the Russian-American Confrontation off Martha's Vineyard and the 1978 TV movie The Defection of Simas Kudirka, starring Alan Arkin.
In 1980, Ukrainian youth Walter Polovchak became a cause célèbre when, at age 12, he asked to remain in the United States permanently rather than return with his parents to what was then Soviet Ukraine. His fate became the subject of a five-year struggle between U.S. and Soviet courts, resolved in his favor on October 3, 1985, when Walter turned 18 and, no longer a juvenile, was under no obligation to return to his parents.
Later in the 1980s, Estonian national and alleged Nazi war criminal Karl Linnas was denied sanctuary several times outside the United States before he was finally returned in 1987 to the USSR to face a likely death penalty for war crimes of which he had been convicted in 1962 (see Holocaust trials in Soviet Estonia).
Linnas died of a heart attack in a Leningrad prison hospital on July 2, 1987, while awaiting a possible retrial, 25 years after Soviet courts had convicted him in absentia.
Sanctuary movement in modern times:
See also: Sanctuary movement
The sanctuary movement of the 1980s offered refuge to people fleeing the civil wars in Central America, as part of a broader movement against U.S. foreign policy in the region. By 1987, 440 sites in the United States, including cities and university campuses, had been declared "sanctuaries" open to migrants from these wars.
From the 1980s into the 2000s, there have also been instances of churches providing "sanctuary" for short periods to migrants facing deportation in Germany, France, Belgium, the Netherlands, Norway, Switzerland, Australia and Canada, among other nations.
In 2007, Iranian refugee Shahla Valadi was granted asylum in Norway after spending seven years in church sanctuary following the initial denial of her asylum claim. From 1983 to 2003, Canada experienced 36 sanctuary incidents.
The "New Sanctuary Movement" organization estimates that at least 600,000 people in the United States have at least one family member in danger of deportation.
In 2016, an Icelandic church declared that it would harbor two failed asylum seekers who had violated the Dublin Regulation, but police removed them for deportation, as ecclesiastical immunity has no legal standing.
Other uses:
When referring to a shelter from danger or hardship, sanctuary can mean one of the following:
Shelter sanctuary:
A place offering protection and safety; a shelter, typically used by displaced persons, refugees, and homeless people.
Other sanctuaries:
- safe house,
- right of asylum,
- air-raid shelter,
- emergency shelter,
- refugee camp,
- homeless shelter,
- humanitarian aid,
- relief agency,
- debt relief,
- psychiatric hospital,
- hospice,
- nursing home,
- and special education
Humanitarian sanctuary:
A source of help, relief, or comfort in times of trouble typically used by victims of war and disaster.
Institutional sanctuary:
An institution for the care of people, especially those with physical or mental impairments, who require organized supervision or assistance.
The term "sanctuary" has further come to be applied to any space set aside for private use in which others are not supposed to intrude, such as a "man cave".
Non-human sanctuary:
Animal sanctuary:
Main article: Animal sanctuary
An animal sanctuary is a facility where animals are brought to live and be protected for the rest of their lives. Unlike animal shelters, sanctuaries do not seek to place animals with individuals or groups, instead maintaining each animal until its natural death.
Plant sanctuary:
Main article: Wildlife preserve
Plant sanctuaries are areas set aside to maintain functioning natural ecosystems, to act as refuges for species and to maintain ecological processes that cannot survive in most intensely managed landscapes and seascapes. Protected areas act as benchmarks against which we understand human interactions with the natural world.
See also:
- Asylum (antiquity)
- Cities of Refuge
- Elvira Arellano
- Frith
- Hospitality
- La Iglesia de Nuestra Señora la Reina de los Ángeles
- Shrine
- Sanctuary Movement history on New Standards
- Sanctuary in church architecture – from the Catholic Encyclopedia
- Sanctuary as a place of refuge – from the Catholic Encyclopedia
- Elephant Sanctuary - Rajasthan, India
Racial Bias on Criminal News in the United States Pictured below: See First YouTube above
Racial biases are a form of implicit bias, which refers to the attitudes or stereotypes that affect an individual's understanding, actions, and decisions in an unconscious manner. These biases, which encompass unfavorable assessments, are often activated involuntarily and without the awareness or intentional control of the individual.
Residing deep in the subconscious, these biases are different from known biases that individuals may choose to conceal for the purposes of social and/or political correctness.
Police officers come from all walks of life and they too have implicit bias, regardless of their ethnicity. Racial bias in criminal news reporting in the United States is a manifestation of this bias.
Racial bias in U.S. criminal news:
Racial bias has been recorded in criminal news reporting from the United States, particularly with regard to African American individuals, and a perceived fear of African Americans among European and White Americans.
Racial bias against African Americans:
Historical racism towards African Americans consists of beliefs about African American intelligence, ambition, honesty and other stereotyped characteristics, as well as support for segregation and support for acts of open discrimination.
Dana Mastro's research on racial bias in the United States reveals persistent racial prejudice among Caucasians, characterizing African Americans as violent and aggressive. These beliefs have been found to manifest in a heightened fear among Caucasians of victimization at the hands of racial minorities, specifically African American males.
Both theory and empirical evidence indicate that media exposure contributes to the construction and perpetuation of these perceptions by disproportionately depicting racial/ethnic minorities as criminal suspects and Caucasians as victims in television news.
Further, consuming these messages has been shown to provoke prejudicial responses among Caucasian viewers.
Robert Entman suggests that in today's media environment, old-fashioned racial images are socially undesirable; stereotyping is now subtler, and stereotyped thinking is reinforced at levels likely to remain below conscious awareness.
Rather than the grossly demeaning distortions of yesterday's stereotyping, there is now a grey area that allows denial of the racial component. A phrase such as "threatening black male" attaches a negative attribute rather than mounting an open attack on racial identity.
Twitter, one of the more widely used forms of social media, with over 271 million active users, is the choice of the Millennial generation to get breaking news. Using hashtags, such as #michaelbrown, when they post allows for individuals to find information in a simpler manner.
The study reported in the article Race and Punishment states that current crime coverage strategies inflate the perceived importance of a crime, distorting the public's sense of who commits crimes and prompting biased reactions. By over-representing Caucasians as victims of crimes perpetrated by people of color, such coverage exaggerates crime committed by African Americans and downplays the victimization of African Americans.
For example, the majority of US homicides are intra-racial, but media accounts often portray a world in which African American male offenders are overrepresented.
African American suspects presentation in news:
A study by the Sentencing Project reports that African American crime suspects were presented in more threatening contexts than Caucasians; to specify, African American suspects were more often left unnamed and were more likely to be shown as threatening by being depicted in physical custody of the police.
Analyses of television news consistently indicate that African American males are overrepresented as perpetrators and underrepresented as victims, compared to both their Caucasian male counterparts on TV as well as real-world Department of Justice arrest reports.
In these news stories, African American suspects are more likely than Caucasians to be portrayed as nameless, menacing, and in the grasp of the police. Some evidence also suggests that audiences know the news they watch misrepresents the reality of race and crime in the United States, and that news executives know their broadcasts scare their audiences.
Dana Mastro reports that African Americans are nearly four times more likely to be represented as criminals than police officers on television news—a proportion inconsistent with U.S. Department of Labor statistics. Alongside their over-representation as criminals in the news, African Americans also are underrepresented as victims compared with their on-air counterparts.
Another study, by Dixon and Williams, concluded that this remains the case on cable news channels, with one difference: African American homicide victims may be more likely to be shown on cable networks than on broadcast networks when national stories such as the fatal shooting of Trayvon Martin receive constant coverage, and thus more airtime, on twenty-four-hour cable programming. The study also concluded that future research on this bias would need to include a larger sample of programs with more polarizing on-air personalities, across different parts of the country and different age groups.
Further, the text of crime-related news stories also has been found to vary depending on the race of the perpetrator. For example, Dixon and Linz's research reveals that statements containing prejudicial information about criminal suspects, such as prior arrests, were significantly more likely to be associated with African Americans as opposed to Caucasians defendants, particularly in cases involving Caucasian victims. Exposure to biased messages has consequences.
When the public consistently consumes this persistent over-representation of African American males in crime-related news stories, it strengthens the cognitive association between Blacks and criminality, which then becomes chronically accessible for use in race-related evaluations. Examples include higher support for the death penalty, because crime is framed as a Black problem that people need protection from, and the belief that laziness is the only roadblock to success for people of color.
Notably, as the research on media priming illustrates, even a single exposure to these unfavorable characterizations can produce stereotype-based responses.
Journalistic practices:
Studies conducted by The Sentencing Project found that journalists gravitated towards cases where Caucasians were the victims and cases where the assailant was African American.
Studies drew the conclusion that newsworthiness is not a product of how representative or novel a crime is, but rather how well it can be "scripted using stereotypes grounded in racism and fear of African American crime."
Robert Entman believes it is crucial to understand that journalists may not personally endorse modern racism. News personnel shape reports in accordance with professional norms and conventions rather than their own perspectives. Furthermore, journalistic practices yield dialogue that fits audience stereotypes.
For example, journalists selecting sound bites for a story about African American political activity will often choose those that convey trauma and conflict. Entman suggests that African American leaders produce an ample supply of such quotes because the structure of social and political power often elicits them.
A table compiled by Robert Entman in his article Blacks in the News: Television, Modern Racism and Cultural Change shows that only 11% of stories about African Americans accused of crimes, compared with 29% of those about Caucasians, allowed the accused or their defenders to present information in their own voices. This suggests that African Americans are treated less humanely and less individually than Caucasians.
A smaller study delved deeper into journalistic racial bias by moving from the data to the reporters themselves.
An article by Emily Drew, published in the journal Critical Studies in Media Communication in 2011, reviewed interview data from 31 reporters at 28 major newspapers across the U.S. Each newspaper had at one time published a series of articles covering race relations, running from one month to a year. Throughout the different series a common theme kept emerging: the reporters questioned why "race relations appeared to be worsening – and racial disparities increasing".
During the interviews, the journalists analyzed why the media had not helped race relations when they held the power to do so "just by choosing which stories to cover". They began to investigate their own bias and concluded that two of the most common sources of bias in journalism are unrecognized privilege and a lack of diversity, either on staff or in the city where they live and work.
The journalists interviewed also recognized that this bias does not end with white reporters.
One journalist was quoted as saying, "Black folks are not immune to it either. If you get writing for a white newspaper for long enough, you start to write and even think in a white voice." Some short-lived efforts arising from this study included programs aimed at hiring more people of color to staff these papers and at getting reporters to become part of the communities they were covering.
Media outlets and racial bias against African Americans:
Fox News:
Media Matters for America, a "progressive research and information center dedicated to comprehensively monitoring, analyzing, and correcting conservative misinformation in the U.S. media" is an outspoken critic of Fox News, frequently accusing the channel of including racial overtones in news coverage.
Furthermore, an MMFA article claims that Fox News labelled the shooting of an Australian teen a racial hate crime. MMFA was particularly outraged over an incident on Fox News' show On The Record With Greta Van Susteren in which guest Pat Buchanan claimed that "racial hate crimes [are] 40 times more prevalent in the black community than the white community."
Media Matters for America also asserted that on an edition of Fox & Friends covering the Ferguson shootings, reporter Peter Doocy described the DOJ's finding of racial bias, emphasizing that Attorney General Eric Holder "floated the possibility" of dissolving the Ferguson police department as a result, while co-host Steve Doocy linked the DOJ report and Holder's response to the shooting of two police officers in Ferguson. Holder stated at a press conference that it was revenue, rather than law enforcement, that drove officers to target African Americans in the community.
Doocy described the shooting, saying, "a new wave of violence comes one week after Attorney General Eric Holder vowed to dismantle that city's police department", and questioned whether it was "what he wanted."
ABC News:
ABC News has also been seen to falter journalistically, displaying a bias shaped by third parties. In the case of Mumia Abu-Jamal, a Philadelphia journalist and activist convicted and sentenced to death for the 1981 murder of a police officer, ABC News presented a specific argument for its audience.
Tom Gardner, a professor at Westfield State University, looked deeper into the case and identified many aspects of the trial that needed to be reassessed. The Media Education Foundation took up the case, telling the story of the controversial trial with Gardner and asking "important questions about the responsibility that journalists have when it comes to issues of life and death."
The documentary Framing an Execution: The Media & Mumia Abu-Jamal looks at the way Sam Donaldson of the ABC program 20/20 covered the case. Many scholars believe that Abu-Jamal is a political prisoner, in jail only because of his views and his criticism of how police have dealt with the black community. The case gained recognition after people continued to dispute whether Abu-Jamal's trial was fair or lawful, to the extent that it reached national and international attention. 20/20 told the case as an emotional story, minimizing its importance.
Sam Donaldson began his interviews with the widow, Maureen Faulkner. She was portrayed as a damsel in distress, making her a more sympathetic figure. From the beginning the specific angle of ABC News and the direction of Sam Donaldson's bias could be seen. ABC stated in their letter sent to the Pennsylvania prison authorities when trying to get an interview with Abu-Jamal that they were "currently working in conjunction with Maureen Faulkner and the Philadelphia Fraternal Order of the Police."
Framing has a couple of meanings; to the filmmakers, the one that best described the behavior of 20/20 is "to falsely set someone up to look as though they are guilty." Because of the unfairness of the trial proceedings, many have argued that it is impossible for anyone to know if Abu-Jamal is guilty or not, but the way the media has framed his trial says otherwise.
Mike Farrell believes that it is important to look at "the political context, the tone of the time in Philadelphia, at that period before and after to understand the context of this trial." When Farrell and Ed Asner were interviewed on 20/20, Donaldson portrayed them as "know-nothing dupe celebrities" once they started to sound knowledgeable.
Donaldson maintained that the trial wasn't unfair but that Mumia was unfair to the trial. He put down Abu-Jamal and those standing up for him by accusing them of taking on the behavior of a religious cult.
Thomas Gardner opined that the 20/20 program "was never really journalism to start with. It was an exercise in persuasion, in rhetoric, really unadulterated propaganda masquerading as journalism."
Amnesty International stated that "numerous aspects of this case clearly failed to meet minimum international standards safeguarding the fairness of legal proceedings", and that it believes the interests of justice would best be served by the granting of a new trial to Mumia Abu-Jamal.
Angela Davis, an activist, scholar, and author believes that the media purposely prevented people from understanding the case of Abu-Jamal, and that they wanted to keep the public unaware to make sure there would not be large numbers of people supporting his campaign.
Search engines and racial bias against African Americans:
Professor Latanya Sweeney of Harvard University identified "significant discrimination" in online searches: searches including names typically associated with black people were more likely to yield results relating to criminal activity, which, according to Sweeney, may expose "racial bias in society".
Police bias:
See also: Institutional racism
Following a civil rights investigation into the department's shooting of Michael Brown, which sparked protests and riots in the area, the United States Department of Justice concluded that the police department of Ferguson, Missouri had been racially biased against African Americans: after accounting for all variables other than race, the police were found to have routinely violated the constitutional rights of Ferguson's African American residents.
Other reports indicate protests were held around the United States as a result of Michael Brown's death in Ferguson, Missouri. Researchers have studied police shootings to determine whether ethnicity plays a role in an officer's decision to use excessive force.
The Department of Justice has determined that black suspects are killed by police officers more often than suspects of other races. Although three-quarters of Ferguson's population is African American, its police department is almost entirely white. The city, like many other major cities, has begun making changes in the past year to try to improve its racial fairness.
See also:
- United States incarceration rate § Editorial policies of major media
- Media bias in the United States
- Race and crime in the United States
Wealth in the United States including a List of the Wealthiest Americans According to Forbes Magazine
Pictured (L-R): Bill Gates, Jeff Bezos, and Warren Buffett
- YouTube Video: Wealth Inequality in America by Bill Maher
- YouTube Video: 10 Countries Where Billionaires Hide Their Money
- YouTube Video: 14 billionaires join Buffett, Gates in giving pledge*
Wealth in the United States is commonly measured in terms of net worth, which is the sum of all assets, including the market value of real estate, like a home, minus all liabilities.
For example, a household in possession of an $800,000 house, $5,000 in mutual funds, $30,000 in cars, $20,000 worth of stock in their own company, and a $45,000 IRA would have assets totaling $900,000.
Assuming that this household would have a $250,000 mortgage, $40,000 in car loans, and $10,000 in credit card debt, its debts would total $300,000. Subtracting the debts from the worth of this household's assets (900,000 − $300,000 = $600,000), this household would have a net worth of $600,000. Net worth can vary with fluctuations in value of the underlying assets.
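The household arithmetic above can be sketched in a few lines of Python; the figures are the hypothetical household from the text:

```python
# Illustrative net-worth calculation for the hypothetical household above.
assets = {
    "house": 800_000,
    "mutual_funds": 5_000,
    "cars": 30_000,
    "company_stock": 20_000,
    "ira": 45_000,
}
liabilities = {
    "mortgage": 250_000,
    "car_loans": 40_000,
    "credit_cards": 10_000,
}

total_assets = sum(assets.values())            # 900,000
total_liabilities = sum(liabilities.values())  # 300,000
net_worth = total_assets - total_liabilities   # 600,000
print(f"Net worth: ${net_worth:,}")            # prints "Net worth: $600,000"
```

Note that `net_worth` would move with any fluctuation in the underlying asset values, e.g. a change in the market value of the house.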
The wealth—more specifically, the median net worth—of households in the United States varies with race and education. As one would expect, households with greater income tend to have the highest net worth, though high income is not always an accurate indicator of net worth.
Overall, the number of wealthier households is on the rise, as baby boomers reach the peaks of their careers. In addition, wealth is unevenly distributed: the wealthiest 25% of US households owned 87% of the wealth in the United States, which totaled $54.2 trillion in 2009.
U.S. household and non-profit organization net worth rose from $44.2 trillion in Q1 2000 to a pre-recession peak of $67.7 trillion in Q3 2007. It then fell $13.1 trillion to $54.6 trillion in Q1 2009 due to the subprime mortgage crisis. It then recovered, rising consistently to $86.8 trillion by Q4 2015. This is nearly double the 2000 level.
Including human capital such as skills, the United Nations estimated the total wealth of the United States in 2008 to be $118 trillion.
Income vs. Wealth:
While income is often treated as a type of wealth in colloquial usage, wealth and income are two substantially different measures of economic prosperity. Wealth is the total value of an individual's or household's net possessions, while income is the total inflow of wealth over a given time period.
Hence the change in wealth over that time period equals income minus expenditures in that period. Income is a so-called "flow" variable, while wealth is a so-called "stock" variable.
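The stock-versus-flow relationship can be illustrated with a minimal sketch; the income and expenditure figures below are hypothetical:

```python
# Wealth is a stock (measured at a point in time); income and expenditures
# are flows (measured over a period). All figures here are hypothetical.
wealth_start = 600_000        # stock at the start of the year
annual_income = 90_000        # flow in over the year
annual_expenditures = 70_000  # flow out over the year

# The change in wealth over the period equals income minus expenditures.
wealth_end = wealth_start + annual_income - annual_expenditures
print(wealth_end)  # prints 620000
```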
Click on any of the following blue hyperlinks for more about "Wealth in the United States":
- Changes in wealth, 1989–2001
- Changes in wealth after 2007
- Mechanisms to gain wealth
- Wealth distribution
- See also:
Approaches to the Reduction of Prejudice
- YouTube Video: Prejudice: The Roots of Discrimination
- YouTube Video: Revealing the True Donald Trump: A Devastating Indictment of His Business & Life (2016)
- YouTube Video: Tucker Carlson: Last Week Tonight with John Oliver (HBO)
There is a great deal of research on the factors that lead to the formation of prejudiced attitudes and beliefs. There is also a lot of research on the consequences of holding prejudiced beliefs and being the target of such beliefs.
Advances have indeed been made in understanding the nature of prejudice. A consensus on how to end prejudice has yet to be established, but a number of scientifically examined strategies have been developed in an attempt to address this social issue.
Intergroup interaction approaches:
Intergroup interaction approaches to prejudice reduction refer to strategies in which members of one group are put in situations where they have to interact with members of another group that they may hold prejudiced beliefs about. For example, if people from Group X are prejudiced towards people from Group Y or vice versa, an intergroup approach would require at least one person from Group X to interact with at least one person from Group Y.
The expectation is that prejudice will decrease following a specified type of interaction.
Intergroup approaches to prejudice reduction have been studied a great deal in laboratory settings, as well as outside of the laboratory, particularly in schools. Many intergroup prejudice reduction approaches are grounded in one of three main theoretical perspectives: interdependence, intergroup contact, and social identity.
Interdependence approaches:
Interdependence approaches to prejudice reduction are based on psychologist Morton Deutsch's theory of interdependence. According to this theory, when two groups realize that they have a common problem that can only be solved by pooling their resources, they are more likely to engage in cooperative behaviors. Cooperation then results in friendliness during discussion and positive evaluations of the individuals from the other group.
Cooperative learning is an interdependence approach originally developed for the purpose of reducing racial prejudice in schools. It is most frequently examined in school settings, and studies testing this approach often occur across weeks.
This approach is most frequently associated with the “jigsaw method” created by social psychologist Elliot Aronson. With this method, students are put into diverse teams of five or six people and assigned a task to complete. Each person is given a unique part of the total material necessary for solving the task.
Thus, in order to complete the task, team members have to work together, each sharing their unique information with the others. The jigsaw method has been shown to actually reduce prejudice toward members of the stigmatized group. A stigmatized group is a group that “has an attribute that marks them as different and leads them to be devalued in the eyes of others”.
The stigmatized group in the context of the jigsaw method is typically a racial minority group. Getting members of the non-stigmatized group to engage in cooperative behaviors with members of the stigmatized group results in increased liking, increased perspective taking, and increased helping behaviors between the different group members.
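As an illustration only, the one-unique-piece-per-member structure of the jigsaw method can be sketched in a few lines of Python (the names, topics, and helper function below are hypothetical, not from the original studies):

```python
# Hypothetical sketch of the jigsaw method's structure: each student in a
# team receives exactly one unique piece of the material, so the task can
# only be completed if every member shares their piece with the others.
def jigsaw_assign(team, material_pieces):
    """Pair each team member with one unique piece of the material."""
    if len(team) != len(material_pieces):
        raise ValueError("jigsaw requires exactly one unique piece per member")
    return dict(zip(team, material_pieces))

team = ["Ana", "Ben", "Chi", "Dev", "Eva"]
pieces = ["introduction", "causes", "timeline", "key figures", "aftermath"]
assignments = jigsaw_assign(team, pieces)

# Every member holds a different piece, so completing the whole task
# forces cooperation across the entire team.
assert len(set(assignments.values())) == len(team)
```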
Another variation of cooperative learning is the competitive-cooperation method. With this method, the learning environment is set up such that students are assigned to diverse groups of four or five people, and the groups compete with each other in a weekly learning-game tournament. Thus, group members are dependent on one another, and cooperation is necessary for them to do well and outperform the other groups in the tournament.
The same outcomes of liking, perspective taking and helping behavior are expected with this type of cooperative learning strategy.
Overall, cooperative learning strategies have been quite effective in reducing prejudice.
However, as cooperative learning is generally studied with children in school settings, it is not clear what its impact is on adults. Also, there is little research on whether or not the reduction of prejudice that students experience as a result of cooperative learning extends to their perceptions of the stigmatized group as a whole or just to those members that are part of their assigned cooperative learning group.
Intergroup contact approaches:
Contact approaches to prejudice reduction are based on prominent social psychologist Gordon Allport's contact hypothesis. According to this hypothesis, prejudice is best reduced under optimal conditions of contact between those who hold prejudiced beliefs and those who are the targets of those beliefs.
The optimal conditions include equal status between groups in the context of the given situation, shared goals, authority support, and cooperation as opposed to competition. (This does overlap with the cooperative learning strategy discussed above.) Stuart Cook's “railroad studies” are classic examples of the contact hypothesis put into practice.
These railroad studies took place in the American South during the 1960s, an especially challenging time and place with respect to interracial relations. In these studies, racially prejudiced white adults were hired to perform a railroad management task with two coworkers under the guise that they were employed at a real part-time job.
Unbeknownst to them, the two coworkers – one White and one Black – were research assistants. After working with the two coworkers for over a month under optimal conditions, the initially prejudiced white participants rated their coworkers highly in attractiveness, likeability, and competence.
Moreover, several months later, participants still expressed lower prejudice than prejudiced whites that did not have the intergroup contact experience.
Social identity approaches:
According to social identity theory, people are biased to favor their ingroup – the group they identify as belonging to – at the expense of the outgroup – the group they do not identify with. Social identity-based approaches to prejudice reduction attempt to make a particular group-based identity, such as race or gender, less salient to individuals from different groups by emphasizing alternative ways of categorizing people.
One way of making a particular group-based identity less salient is through decategorization. Decategorization involves teaching people from different social groups to focus on a person's unique individual characteristics. This is known as individuation, and helps to draw attention away from group differences and toward individual differences. Decategorization often causes ingroup members to perceive fewer similarities among themselves.
Another way of making a particular group-based identity less salient is through recategorization. Here, individuals with different group-based identities are made aware that the groups to which they belong are part of an overarching group. The salience of their membership in the overarching group is emphasized over their exclusive group-based identities.
For example, membership in the group “student” would be emphasized over membership in the group “humanities major” or “sciences major”.
Similar to recategorization, crossed categorization is when individuals from opposing groups are made aware that they both simultaneously belong to a third, separate group, and membership in this third group is emphasized. For example, membership in the group “military veteran” would be emphasized over membership in the group “humanities major” or “sciences major”.
Integrative models acknowledge the coexistence of separate group-based identities within a common group identity. This is in alignment with multicultural ideologies that emphasize appreciation for racial and ethnic diversity while still emphasizing a common national identity.
Because divisive group membership is deemphasized in these categorization strategies, people from opposing groups express less ingroup favoritism. However, they do not necessarily show a reduction of bias against the outgroup. This approach has primarily been studied in laboratory settings and often with arbitrarily assigned group categories.[17] It is not entirely clear how these results translate when considering existing social groups in real-world settings.
Disclosure approaches:
Disclosure approaches rely on self-disclosure of personal information. Here, two individuals from different social groups each reveal a piece of personal information about themselves. The act of disclosure signals vulnerability, which increases trust and liking and, in turn, results in a decrease in prejudiced beliefs. It is not clear whether the decrease in prejudice extends beyond the disclosing individual to the social group to which that person belongs.
Individual approaches:
Individual approaches to prejudice reduction are not dependent on intergroup interaction. These approaches only require that an individual be exposed to some relevant information and/or engage in an activity intended to reduce prejudice. There are two main types of individual approaches to prejudice reduction: affective strategies that target what and how you feel, and cognitive strategies that target what and how you think.
A lot of the evidence on the effectiveness of affective and cognitive strategies is based on laboratory findings. As most of these studies consist of one-time sessions, it is unclear how long the positive effects of the strategies last. Also, there is not much knowledge about the extent to which these strategies are effective in situations outside of the laboratory.
Affective approaches:
Perspective taking. Taking the perspective of an individual from a stigmatized group has been shown to be effective in reducing prejudice because it evokes feelings of similarity and affinity toward the other person. Evidence from laboratory studies suggests that perspective taking specifically leads to a decrease in the use of stereotypes when categorizing or evaluating a member of a stigmatized group.
Empathy. Encouraging individuals to be empathetic toward stigmatized groups is another feeling-based strategy. Being instructed to be empathetic after reading about or watching videos of discrimination against a stigmatized group, such as African Americans, results in decreased expressions of prejudice and a stronger willingness to engage in contact with members of the stigmatized group.
Cognitive approaches:
Thought awareness and suppression. Increasing a person's awareness of his or her prejudiced thoughts and instructing that person to actively suppress those thoughts is a form of prejudice reduction that has been frequently studied in laboratory settings. However, suppression does not always reduce prejudice and sometimes has the opposite effect of increasing it.
Attitude reconditioning. There are several strategies that attempt to recondition or retrain implicit prejudiced attitudes – attitudes that exist outside of a person's conscious awareness.
One way of reconditioning implicit attitudes is through classical conditioning, whereby a representation of a stigmatized group is paired with positive images or positive words. While this is helpful in reducing implicit prejudice, it is not necessarily successful at changing conscious attitudes.
Another method of reconditioning is known as Situational Attribution Training. This training, based on the ultimate attribution error, reduces implicit prejudice by getting people to focus on situational explanations for negative behaviors displayed by members of stigmatized groups. Again, it is unclear if this leads to a decrease in conscious prejudiced attitudes.
Thought process reconditioning. Some research suggests that teaching people how to engage in more complex thinking elicits less biased evaluations of outgroup members. For example, instructing people on how to apply statistical reasoning to everyday judgments leads people to make more accurate assessments of outgroup members.
Experts and norms. When people are told that experts believe personality traits are changeable and learned, they decrease in their stereotyping of stigmatized groups. Also, stereotyping decreases when people are told that stereotyping of a particular stigmatized group is not the norm for their peers.
Accountability and value consistency. Some prejudice reduction strategies rely on creating a sense of internal conflict. One such strategy involves holding people accountable for their prejudice. Prejudice has been shown to decrease when people are asked to provide concrete reasons for prejudiced beliefs. The process of generating these reasons gets people to consider the irrational nature of their prejudiced beliefs.
Another strategy is to get people to view prejudice as being inconsistent with their behaviors or valued attitudes. This creates cognitive dissonance, and people attempt to resolve this tension by reducing expressions of prejudice. For example, after agreeing to write a public statement advocating a policy that is beneficial to racial minorities but costly to whites, whites report more personal support for this policy than before being asked to write the public statement.
Self-affirmation. People are also less likely to endorse prejudiced beliefs when their own self-worth is affirmed. After being made to feel good about themselves, people are more likely to positively rate job candidates from stigmatized groups and less likely to negatively stereotype people from stigmatized groups.
Integrated approaches:
Integrated approaches to prejudice reduction include both intergroup and individual components, such as vicarious intergroup contact, perspective taking, and empathy. Many of these integrated approaches involve some form of entertainment. After cooperative learning, entertainment-based interventions are the second most popular prejudice reduction strategy tested in non-laboratory settings.
Reading interventions are particularly popular:
Reading interventions. Reading interventions typically take place in schools and last an average of 5 weeks. They attempt to influence prejudiced beliefs through the use of engaging stories. Often these stories highlight positive interactions between children who are similar to those receiving the intervention and children who differ from them based on their membership in a stigmatized group.
Furthermore, when an emphasis is placed on individual characteristics as opposed to group membership, an experience of vicarious intergroup friendship occurs, and this leads to more positive attitudes toward children from stigmatized groups. There is little knowledge, however, of how such interventions influence children's behavior in actual intergroup interactions.
Prejudice reduction strategies not often studied:
Despite the fact that billions of dollars are spent on workplace diversity training each year, such training is not necessarily informed by prejudice-reduction research, and its effectiveness in reducing prejudice has rarely been examined.
Furthermore, when an emphasis is placed on individual characteristics as opposed to group membership, an experience of vicarious intergroup friendship occurs, and this leads to more positive attitudes toward children from stigmatized groups. There is little knowledge, however, of how such interventions influence children's behavior in actual intergroup interactions.
Prejudice reduction strategies not often studied:
Although billions of dollars are spent on diversity training each year, workplace diversity training is not necessarily informed by prejudice-reduction research, and its effectiveness in reducing prejudice has rarely been examined.
Civil Liberties: International and Specific to the United States
Civil liberties are guarantees and freedoms that governments commit not to abridge, whether by constitution, legislation, or judicial interpretation, without due process.
Though the scope of the term differs between countries, civil liberties may include:
- the freedom of conscience,
- freedom of press,
- freedom of religion,
- freedom of expression,
- freedom of assembly,
- the right to security and liberty,
- freedom of speech,
- the right to privacy,
- the right to equal treatment under the law and due process,
- the right to a fair trial, and the right to life.
- Other civil liberties include
- the right to own property,
- the right to defend oneself,
- and the right to bodily integrity.
Within the distinctions between civil liberties and other types of liberty, distinctions exist between positive liberty/positive rights and negative liberty/negative rights.
Overview:
Many contemporary nations have a constitution, a bill of rights, or similar constitutional documents that enumerate and seek to guarantee civil liberties. Other nations have enacted similar laws through a variety of legal means, including signing and ratifying or otherwise giving effect to key conventions such as the European Convention on Human Rights and the International Covenant on Civil and Political Rights.
The existence of some claimed civil liberties is a matter of dispute, as are the extent of most civil rights. Controversial examples include property rights, reproductive rights, and civil marriage.
In authoritarian regimes in which government censorship impinges on civil liberties, some civil-liberty advocates argue for the use of anonymity tools to allow for free speech, privacy, and anonymity.
The degree to which democracies curtail civil liberties must also take into account the influence of terrorism. Whether the existence of victimless crimes infringes upon civil liberties is a matter of dispute.
Another matter of debate is the suspension or alteration of certain civil liberties in times of war or state of emergency, including whether and to what extent this should occur.
The formal concept of civil liberties is often dated back to Magna Carta, an English legal charter agreed to in 1215, which in turn was based on pre-existing documents, notably the Charter of Liberties.
Click on any of the following blue hyperlinks for more about International Civil Liberties:
- Asia
- Australia
- Europe
- North America
- Canada
- See next topic for the United States
- See Also:
- American Civil Liberties Union
- Canadian Civil Liberties Association
- Civil and political rights
- Civil libertarianism
- Drug liberalization
- Equality and Human Rights Commission
- Fundamental freedoms
- Human rights
- Libertarianism
- Liberalism
- Liberty (pressure group)
- List of civil rights leaders
- Privacy
- Proactive policing
- Rule according to higher law
- Rutherford Institute
- Teaching for social justice
- State of World Liberty Index
- Statewatch
- Court cases involving Civil Liberties held at the National Archives at Atlanta
- Leading Civil Liberties Organizations in the United States
- Rights, Civil Liberties and Freedoms in Russia
- The Cato Institute: Civil Liberties
- USDOJ: Privacy and Civil Liberties Office
Civil liberties in the United States
Civil liberties in the United States are certain unalienable rights retained by (as opposed to privileges granted to) citizens of the United States under the Constitution of the United States, as interpreted and clarified by the Supreme Court of the United States and lower federal courts.
Civil liberties (see above) are simply defined as individual legal and constitutional protections from entities more powerful than an individual, for example, parts of the government, other individuals, or corporations. The explicitly defined liberties make up the Bill of Rights, including freedom of speech, the right to bear arms, and the right to privacy.
There are also many liberties of people not defined in the Constitution, as stated in the Ninth Amendment: The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people.
The extent of civil liberties and the share of the U.S. population with access to them has expanded over time. For example, the Constitution did not originally define who was eligible to vote, leaving each state to determine eligibility.
In the early history of the U.S., most states allowed only white male adult property owners to vote (about 6% of the population). The Three-Fifths Compromise allowed southern slaveholders to consolidate power and maintain slavery in America for eighty years after the ratification of the Constitution, and the Bill of Rights had little impact on court judgments for the first 130 years after ratification.
United States Constitution:
Freedom of religion:
The text of Amendment I to the United States Constitution, ratified December 15, 1791, states that:
"Congress shall make no law... prohibiting the free exercise thereof;"
— United States Constitution, Amendment I
Freedom of expression:
Main article: Freedom of speech in the United States
Free Speech Clause:
Main article: Free Speech Clause
The text of Amendment I to the United States Constitution, ratified December 15, 1791, states that:
"Congress shall make no law... abridging the freedom of speech,"
— United States Constitution, Amendment I
Free Press Clause:
The text of Amendment I to the United States Constitution, ratified December 15, 1791, states that:
"Congress shall make no law... abridging... the press,"
— United States Constitution, Amendment I
Free Assembly Clause:
The text of Amendment I to the United States Constitution, ratified December 15, 1791, states that:
"Congress shall make no law... abridging... the right of the people peaceably to assemble,"
— United States Constitution, Amendment I
Petition Clause:
Main article: Petition Clause
The text of Amendment I to the United States Constitution, ratified December 15, 1791, states that:
"Congress shall make no law... abridging... the right of the people... to petition the Government for a redress of grievances."
— United States Constitution, Amendment I
Free speech exceptions:
Main article: United States free speech exceptions
The following types of speech are not protected constitutionally:
- defamation or false statements,
- child pornography,
- obscenity,
- damaging the national security interests,
- verbal acts,
- and fighting words.
Because these categories fall outside First Amendment protection, the courts can legally restrict or criminalize expressive acts within them. Other expression, including threats of bodily harm or publicizing illegal activity, may also be ruled illegal.
Right to keep and bear arms:
Main article: Right to keep and bear arms in the United States
The text of Amendment II to the United States Constitution, ratified December 15, 1791, states that:
"A well regulated Militia, being necessary to the security of a free state, the right of the people to keep and bear Arms, shall not be infringed."
— United States Constitution, Amendment II
Sexual freedom:
The concept of sexual freedom encompasses a broad range of rights that are not mentioned in the U.S. Constitution. The idea has sprung largely from popular opinion in recent years and has had very little constitutional backing.
The following liberties are included under sexual freedom:
- sexual expression,
- sexual choices,
- sexual education,
- reproductive justice,
- and sexual health.
Sexual freedom is generally considered an implied right, and is not mentioned in the Constitution.
Sexual freedoms include the freedom to have consensual sex with whomever a person chooses, at any time, for any reason, provided all involved are of the age of majority. Marriage is not required, nor are there requirements as to the gender or number of partners. Sexual freedom includes the freedom to have private consensual homosexual sex (Lawrence v. Texas).
Equal protection:
Main article: Equal Protection Clause
Equal protection prevents the government from creating laws that are discriminatory in application or effect.
Right to vote:
The text of Amendment XIV to the United States Constitution, ratified July 9, 1868, states that:
"when the right to vote at any election for the choice of electors for President and Vice President of the United States, Representatives in Congress, the Executive and Judicial officers of a State, or the members of the Legislature thereof, is denied to any of the male inhabitants of such State, being twenty-one (eighteen) years of age, and citizens of the United States, or in any way abridged, except for participation in rebellion, or other crime, the basis of representation therein shall be reduced in the proportion which the number of such male citizens shall bear to the whole number of male citizens twenty-one (eighteen) years of age in such State."
— United States Constitution, Amendment XIV
The text of Amendment XV to the United States Constitution, ratified February 3, 1870, states that:
"The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of race, color, or previous condition of servitude."
— United States Constitution, Amendment XV
The text of Amendment XIX to the United States Constitution, ratified August 18, 1920, states that:
"The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex."
— United States Constitution, Amendment XIX
The text of Amendment XXIV to the United States Constitution, ratified January 23, 1964, states that:
"The right of citizens of the United States to vote in any primary or other election for President or Vice President, for electors for President or Vice President, or for Senator or Representative in Congress, shall not be denied or abridged by the United States or any state by reason of failure to pay any poll tax or other tax."
— United States Constitution, Amendment XXIV
The text of Amendment XXVI to the United States Constitution, ratified July 1, 1971, states that:
"The right of citizens of the United States, who are 18 years of age or older, to vote, shall not be denied or abridged by the United States or any state on account of age."
— United States Constitution, Amendment XXVI
Right to privacy:
This section is an excerpt from Right to privacy § United States.
The Constitution of the United States and United States Bill of Rights do not explicitly include a right to privacy. Currently no federal law takes a holistic approach to privacy regulation.
In the U.S., privacy and associated rights have been determined through court cases, and protections have been established through legislation.
The Supreme Court in Griswold v. Connecticut, 381 U.S. 479 (1965) found that the Constitution guarantees a right to privacy against governmental intrusion via penumbras located in the founding text.
An 1890 article by Samuel Warren and Louis Brandeis, published in the Harvard Law Review and titled "The Right to Privacy", is often cited as the first implicit articulation of a U.S. stance on the right to privacy.
The right to privacy has been the justification for decisions in a wide range of civil liberties cases, including Pierce v. Society of Sisters, which invalidated a successful 1922 Oregon initiative requiring compulsory public education; Roe v. Wade, which struck down a Texas abortion law and thus restricted state powers to enforce laws against abortion; and Lawrence v. Texas, which struck down a Texas sodomy law and thus eliminated state powers to enforce laws against sodomy.
Legally, the right of privacy is a basic right which includes:
- The right of persons to be free from unwarranted publicity
- Unwarranted appropriation of one's personality
- Publicizing one's private affairs without a legitimate public concern
- Wrongful intrusion into one's private activities
In 2018, California became the first state in the United States to set out to create a policy promoting data protection. The resulting effort, the California Consumer Privacy Act (CCPA), is regarded as a critical juncture in defining, from the perspective of California lawmakers, what privacy legally entails.
The California Consumer Privacy Act is a privacy law protecting residents of California and their personal identifying information. The law regulates covered companies regardless of their operational geography, protecting the rights enumerated in the law.
Right to marriage:
The United States Supreme Court's 1967 ruling in Loving v. Virginia found a fundamental right to marriage, regardless of race. Its 2015 ruling in Obergefell v. Hodges found a fundamental right to marriage, regardless of gender.
See also:
- American Civil Liberties Union
- Civil liberties in the United Kingdom
- Civil rights in the United States
- Constitution of the United States
critical race theory (Oxford Dictionary)
noun
a set of ideas holding that racial bias is inherent in many parts of western society, especially in its legal and social institutions, on the basis of their having been primarily designed for and implemented by white people.
"I took a class in law school that examined case law through the lens of critical race theory"
___________________________________________________________________________
Critical Race Theory (Wikipedia):
Critical race theory (CRT) is a cross-disciplinary examination, by social and civil-rights scholars and activists, of how laws, social and political movements, and media shape, and are shaped by, social conceptions of race and ethnicity.
Goals include challenging all mainstream and "alternative" views of racism and racial justice, including conservative, liberal, and progressive. The word critical in the name is an academic reference to critical thinking, critical theory, and scholarly criticism, rather than criticizing or blaming people.
CRT is also used in sociology to explain social, political, and legal structures and power distribution as through a "lens" focusing on the concept of race, and experiences of racism. For example, the CRT conceptual framework examines racial bias in laws and legal institutions, such as highly disparate rates of incarceration among racial groups in the United States.
A key CRT concept is intersectionality—the way in which different forms of inequality and identity are affected by interconnections of race, class, gender, and disability. Scholars of CRT view race as a social construct with no biological basis.
One tenet of CRT is that racism and disparate racial outcomes are the result of complex, changing, and often subtle social and institutional dynamics, rather than explicit and intentional prejudices of individuals.
CRT scholars argue that the social and legal construction of race advances the interests of White people at the expense of people of color, and that the liberal notion of U.S. law as "neutral" plays a significant role in maintaining a racially unjust social order, where formally color-blind laws continue to have racially discriminatory outcomes.
CRT began in the United States in the post–civil rights era, as 1960s landmark civil rights laws were being eroded and schools were being re-segregated. With racial inequalities persisting even after civil rights legislation and color-blind laws were enacted, CRT scholars in the 1970s and 1980s began reworking and expanding critical legal studies (CLS) theories on class, economic structure, and the law to examine the role of U.S. law in perpetuating racism.
CRT, a framework of analysis grounded in critical theory, originated in the mid-1970s in the writings of several American legal scholars, and it draws on the work of earlier thinkers.
Academic critics of CRT argue it is based on storytelling instead of evidence and reason, rejects truth and merit, and opposes liberalism.
Since 2020, conservative U.S. lawmakers have sought to ban or restrict the instruction of CRT along with other critical education in primary and secondary schools, as well as relevant training inside federal agencies. Advocates of such bans argue that CRT is false, anti-American, villainizes White people, promotes radical leftism, and indoctrinates children.
Advocates of bans on CRT have been accused of misrepresenting the tenets and importance of CRT and of having the goal of broadly silencing discussions of racism, equality, social justice, and the history of race.
Definitions:
In his introduction to the comprehensive 1995 publication of critical race theory's key writings, Cornel West described CRT as "an intellectual movement that is both particular to our postmodern (and conservative) times and part of a long tradition of human resistance and liberation."
Law professor Roy L. Brooks defined critical race theory in 1994 as "a collection of critical stances against the existing legal order from a race-based point of view".
Gloria Ladson-Billings, who—along with co-author William Tate—had introduced CRT to the field of education in 1995, described it in 2015 as an "interdisciplinary approach that seeks to understand and combat race inequity in society." Ladson-Billings wrote in 1998 that CRT "first emerged as a counter-legal scholarship to the positivist and liberal legal discourse of civil rights."
In 2017, University of Alabama School of Law professor Richard Delgado, a co-founder of critical race theory, and legal writer Jean Stefancic defined CRT as "a collection of activists and scholars interested in studying and transforming the relationship among race, racism, and power".
In 2021, Khiara Bridges, a law professor and author of the textbook Critical Race Theory: A Primer, defined critical race theory as an "intellectual movement", a "body of scholarship", and an "analytical tool-set for interrogating the relationship between law and racial inequality."
The 2021 Encyclopaedia Britannica described CRT as an "intellectual and social movement and loosely organized framework of legal analysis based on the premise that race is not a natural, biologically grounded feature of physically distinct subgroups of human beings but a socially constructed (culturally invented) category that is used to oppress and exploit people of colour."
Tenets:
Scholars of CRT say that race is not "biologically grounded and natural"; rather, it is a socially constructed category used to oppress and exploit people of color; and that racism is not an aberration, but a normalized feature of American society. According to CRT, negative stereotypes assigned to members of minority groups benefit white people and increase racial oppression.
Individuals can belong to a number of different identity groups. The concept of intersectionality—one of CRT's main concepts—was introduced by legal scholar Kimberlé Crenshaw.
Derrick Albert Bell Jr. (1930–2011), an American lawyer, professor, and civil rights activist, wrote that racial equality is "impossible and illusory" and that racism in the U.S. is permanent. According to Bell, civil-rights legislation will not on its own bring about progress in race relations; alleged improvements or advantages to people of color "tend to serve the interests of dominant white groups", in what Bell called "interest convergence".
These changes do not typically affect—and at times even reinforce—racial hierarchies. This is representative of the shift in the 1970s, in Bell's re-assessment of his earlier desegregation work as a civil rights lawyer. He was responding to the Supreme Court's decisions that had resulted in the re-segregation of schools.
The concept of standpoint theory became particularly relevant to CRT when it was expanded to include a black feminist standpoint by Patricia Hill Collins. First introduced by feminist sociologists in the 1980s, standpoint theory holds that people in marginalized groups, who share similar experiences, can bring a collective wisdom and a unique voice to discussions on decreasing oppression.
In this view, insights into racism can be uncovered by examining the nature of the U.S. legal system through the perspective of the everyday lived experiences of people of color.
According to Encyclopedia Britannica, tenets of CRT have spread beyond academia, and are used to deepen understanding of socio-economic issues such as "poverty, police brutality, and voting rights violations", that are impacted by the ways in which race and racism are "understood and misunderstood" in the United States.
Common themes:
Richard Delgado and Jean Stefancic published an annotated bibliography of CRT references in 1993, listing works of legal scholarship that addressed one or more of the following themes:
When Gloria Ladson-Billings introduced CRT into education in 1995, she cautioned that its application required a "thorough analysis of the legal literature upon which it is based".
Critique of liberalism:
First and foremost to CRT legal scholars in 1993 was their "discontent" with the way in which liberalism addressed race issues in the U.S. They critiqued "liberal jurisprudence", including affirmative action, color-blindness, role modeling, and the merit principle.
Specifically, they claimed that the liberal concept of value-neutral law contributed to maintenance of the U.S.'s racially unjust social order.
An example questioning foundational liberal conceptions of Enlightenment values, such as rationalism and progress, is Rennard Strickland's 1986 Kansas Law Review article, "Genocide-at-Law: An Historic and Contemporary View of the Native American Experience". In it, he "introduced Native American traditions and world-views" into law school curriculum, challenging the entrenchment at that time of the "contemporary ideas of progress and enlightenment".
He wrote that U.S. laws that "permeate" the everyday lives of Native Americans were in "most cases carried out with scrupulous legality" but still resulted in what he called "cultural genocide".
In 1993, David Theo Goldberg described how countries that adopt classical liberalism's concepts of "individualism, equality, and freedom"—such as the United States and European countries—conceal structural racism in their cultures and languages, citing terms such as "Third World" and "primitive".
In 1988, Kimberlé Williams Crenshaw traced the origins of the New Right's use of the concept of color-blindness from 1970s neoconservative think tanks to the Ronald Reagan administration in the 1980s. She described how prominent figures such as neoconservative scholars Thomas Sowell and William Bradford Reynolds, who served as Assistant Attorney General for the Civil Rights Division from 1981 to 1988, called for "strictly color-blind policies".
Sowell and Reynolds, like many conservatives at that time, believed that the goal of equality of the races had already been achieved, and therefore the race-specific civil rights movement was a "threat to democracy". The color-blindness logic used in "reverse discrimination" arguments in the post-civil rights period is informed by a particular viewpoint on "equality of opportunity", as adopted by Sowell, in which the state's role is limited to providing a "level playing field", not to promoting equal distribution of resources.
Crenshaw claimed that "equality of opportunity" in antidiscrimination law can have both an expansive and a restrictive aspect. Crenshaw wrote that formally color-blind laws continue to have racially discriminatory outcomes.
According to her, this use of formal color-blindness rhetoric in claims of reverse discrimination, as in the 1978 Supreme Court ruling on Bakke, was a response to the way in which the courts had aggressively imposed affirmative action and busing during the Civil Rights era, even on those who were hostile to those issues.
In 1990, legal scholar Duncan Kennedy described the dominant approach to affirmative action in legal academia as "colorblind meritocratic fundamentalism". He called for a postmodern "race consciousness" approach that included "political and cultural relations" while avoiding "racialism" and "essentialism".
Sociologist Eduardo Bonilla-Silva describes this newer, subtle form of racism as "color-blind racism", which uses frameworks of abstract liberalism to decontextualize race, naturalize outcomes such as segregation in neighborhoods, attribute certain cultural practices to race, and cause "minimization of racism".
In his influential 1984 article, Delgado challenged the liberal concept of meritocracy in civil rights scholarship, questioning why the top articles in the most well-established journals were all written by white men.
Storytelling/counter-storytelling and "naming one's own reality":
The use of narrative (storytelling) to illuminate and explore lived experiences of racial oppression.
One of the prime tenets of liberal jurisprudence is that people can create appealing narratives to think and talk about greater levels of justice. Delgado and Stefancic call this the empathic fallacy—the belief that it is possible to "control our consciousness" by using language alone to overcome bigotry and narrow-mindedness.
They examine how people of color, considered outsiders in mainstream US culture, are portrayed in media and law through stereotypes and stock characters that have been adapted over time to shield the dominant culture from discomfort and guilt.
For example, slaves in the 18th-century Southern States were depicted as childlike and docile; Harriet Beecher Stowe adapted this stereotype through her character Uncle Tom, depicting him as a "gentle, long-suffering", pious Christian.
Following the Civil War, the African-American woman was depicted as a wise, care-giving "Mammy" figure.
During the Reconstruction period, African-American men were stereotyped as "brutish and bestial", a danger to white women and children. This was exemplified in Thomas Dixon Jr.'s novels, used as the basis for the epic film The Birth of a Nation, which celebrated the Ku Klux Klan and lynching.
During the Harlem Renaissance, African-Americans were depicted as "musically talented" and "entertaining". Following World War II, when many Black veterans joined the nascent civil rights movement, African Americans were portrayed as "cocky [and] street-smart", the "unreasonable, opportunistic" militant, the "safe, comforting, cardigan-wearing" TV sitcom character, and the "super-stud" of blaxploitation films.
The empathic fallacy informs the "time-warp aspect of racism", where the dominant culture can see racism only through the hindsight of a past era or distant land, such as South Africa.
Through centuries of stereotypes, racism has become normalized; it is a "part of the dominant narrative we use to interpret experience". Delgado and Stefancic argue that speech alone is an ineffective tool to counter racism, since the system of free expression tends to favor the interests of powerful elites and to assign responsibility for racist stereotypes to the "marketplace of ideas".
In the decades following the passage of civil rights laws, acts of racism had become less overt and more covert—invisible to, and underestimated by, most of the dominant culture.
Since racism makes people feel uncomfortable, the empathic fallacy helps the dominant culture to mistakenly believe that it no longer exists, and that dominant images, portrayals, stock characters, and stereotypes—which usually portray minorities in a negative light—provide them with a true image of race in America.
Based on these narratives, the dominant group has no need to feel guilty or to make an effort to overcome racism, as it feels "right, customary, and inoffensive to those engaged in it", while self-described liberals who uphold freedom of expression can feel virtuous while maintaining their own superior position.
Bryan Brayboy has emphasized the epistemic importance of storytelling in Indigenous-American communities as superseding that of theory, and has proposed a Tribal Critical Race Theory (TribCrit).
Standpoint epistemology:
The view that members of racial minority groups have a unique authority and ability to speak about racism. This is seen as undermining dominant narratives relating to racial inequality, such as legal neutrality and personal responsibility or bootstrapping, through valuable first-hand accounts of the experience of racism.
Revisionist interpretations of American civil rights law and progress:
Interest convergence is a concept introduced by Derrick Bell in his 1980 Harvard Law Review article, "Brown v. Board of Education and the Interest-Convergence Dilemma".
In this article, Bell described how he re-assessed the impact of the hundreds of NAACP LDF desegregation cases he won from 1960 to 1966, and how he began to believe that, in spite of his sincerity at the time, anti-discrimination law had not improved Black children's access to quality education.
He listed and described how Supreme Court cases had gutted civil rights legislation, which had resulted in African-American students continuing to attend all-black schools that lacked adequate funding and resources. In examining these Supreme Court cases, Bell concluded that the only civil-rights legislation that was passed coincided with the self-interest of white people, which Bell termed interest convergence.
One of the best-known examples of interest convergence is the way in which American geopolitics during the Cold War in the aftermath of World War II was a critical factor in the passage of civil rights legislation by both Republicans and Democrats. Bell described this in numerous articles, including the aforementioned, and it was supported by the research and publications of legal scholar Mary L. Dudziak.
In her journal articles and her 2000 book Cold War Civil Rights—based on newly released documents—Dudziak provided detailed evidence that it was in the interest of the United States to quell negative international press about the treatment of African-Americans at a time when the majority of the populations of newly decolonized countries, which the U.S. was trying to attract to Western-style democracy, were not white.
The U.S. sought to promote liberal values throughout Africa, Asia, and Latin America to prevent the Soviet Union from spreading communism. Dudziak described how the international press widely circulated stories of segregation and violence against African-Americans.
The Moore's Ford lynchings, in which a World War II veteran was lynched, received particularly wide coverage in the news. American allies followed stories of American racism through the international press, and the Soviets used stories of racism against Black Americans as a vital part of their propaganda.
Dudziak performed extensive archival research in the U.S. Department of State and Department of Justice and concluded that U.S. government support for civil-rights legislation "was motivated in part by the concern that racial discrimination harmed the United States' foreign relations".
When the National Guard was called in to prevent nine African-American students from integrating the Little Rock Central High School, the international press covered the story extensively. The then-Secretary of State told President Dwight Eisenhower that the Little Rock situation was "ruining" American foreign policy, particularly in Asia and Africa.
The U.S.'s ambassador to the United Nations told President Eisenhower that as two-thirds of the world's population was not white, he was witnessing their negative reactions to American racial discrimination. He suspected that the U.S. "lost several votes on the Chinese communist item because of Little Rock."
Intersectional theory:
The examination of race, sex, class, national origin, and sexual orientation, and how their intersections play out in various settings, such as how the needs of a Latina are different from those of a Black male, and whose needs are promoted.
These intersections provide a more holistic picture for evaluating different groups of people. Intersectionality is a response to identity politics insofar as identity politics does not take into account the different intersections of people's identities.
Essentialism vs. anti-essentialism:
Delgado and Stefancic write, "Scholars who write about these issues are concerned with the appropriate unit for analysis: Is the black community one, or many, communities? Do middle- and working-class African-Americans have different interests and needs?
Do all oppressed peoples have something in common?" This is a look at the ways that oppressed groups may share in their oppression but also have different needs and values that need to be analyzed differently. It is a question of how groups can be essentialized or are unable to be essentialized.
From an essentialist perspective, one's identity consists of an internal "essence" that is static and unchanging from birth, whereas a non-essentialist position holds that "the subject has no fixed or permanent identity."
Racial essentialism diverges into biological and cultural essentialism, where subordinated groups may endorse one over the other. "Cultural and biological forms of racial essentialism share the idea that differences between racial groups are determined by a fixed and uniform essence that resides within and defines all members of each racial group. However, they differ in their understanding of the nature of this essence."
Subordinated communities may be more likely to endorse cultural essentialism as it provides a basis of positive distinction for establishing a cumulative resistance as a means to assert their identities and advocacy of rights, whereas biological essentialism may be unlikely to resonate with marginalized groups as historically, dominant groups have used genetics and biology in justifying racism and oppression.
Essentialism is the idea of a singular, shared experience among a specific group of people. Anti-essentialism, by contrast, holds that various other factors can affect a person's being and overall life experience. The race of an individual is viewed as a social construct that does not necessarily dictate the outcome of their life circumstances.
Race is viewed as "a social and historical construction, rather than an inherent, fixed, essential biological characteristic." Anti-essentialism "forces a destabilization in the very concept of race itself..." The results of this destabilization vary depending on the analytic focus, falling into two general categories: "... consequences for the analytic concepts of racial identity or racial subjectivity."
Structural determinism, and race, sex, class, and their intersections:
Exploration of how "the structure of legal thought or culture influences its content" in a way that determines social outcomes. Delgado and Stefancic cited "empathic fallacy" as one example of structural determinism—the "idea that our system, by reason of its structure and vocabulary, cannot redress certain types of wrong."
They interrogate the absence of terms such as intersectionality, anti-essentialism, and jury nullification in standard legal reference research tools in law libraries.
Cultural nationalism/separatism:
The exploration of more radical views that argue for separation and reparations as a form of foreign aid (including black nationalism).
Legal institutions, critical pedagogy, and minorities in the bar:
Camara Phyllis Jones defines institutionalized racism as "differential access to the goods, services, and opportunities of society by race. Institutionalized racism is normative, sometimes legalized and often manifests as inherited disadvantage.
It is structural, having been absorbed into our institutions of custom, practice, and law, so there need not be an identifiable offender. Indeed, institutionalized racism is often evident as inaction in the face of need, manifesting itself both in material conditions and in access to power.
With regard to the former, examples include differential access to quality education, sound housing, gainful employment, appropriate medical facilities, and a clean environment."
Black-white binary:
The black-white binary is a paradigm identified by legal scholars through which racial issues and histories are typically articulated within a racial binary between Black and white Americans.
The binary largely governs how race has been portrayed and addressed throughout U.S. history. Critical race theorists Richard Delgado and Jean Stefancic argue that anti-discrimination law has blind spots for non-black minorities because its language is confined within the black-white binary.
Applications and adaptations:
Scholars of critical race theory have focused, with some particularity, on the issues of hate crime and hate speech. In response to the opinion of the U.S. Supreme Court in the hate speech case of R.A.V. v. City of St. Paul (1992), in which the Court struck down an anti-bias ordinance as applied to a teenager who had burned a cross, Mari Matsuda and Charles Lawrence argued that the Court had paid insufficient attention to the history of racist speech and the actual injury produced by such speech.
Critical race theorists have also argued in favor of affirmative action. They propose that so-called merit standards for hiring and educational admissions are not race-neutral and that such standards are part of the rhetoric of neutrality through which whites justify their disproportionate share of resources and social benefits.
In his 2009 article "Will the Real CRT Please Stand Up: The Dangers of Philosophical Contributions to CRT", philosopher Tommy J. Curry distinguished between the original CRT key writings and what is being done in the name of CRT by a "growing number of white feminists".
The new CRT movement "favors narratives that inculcate the ideals of a post-racial humanity and racial amelioration between compassionate (Black and White) philosophical thinkers dedicated to solving America's race problem."
They are interested in discourse (i.e., how individuals speak about race) and the theories of white Continental philosophers, over and against the structural and institutional accounts of white supremacy which were at the heart of the realist analysis of racism introduced in Derrick Bell's early works, and articulated through such African-American thinkers as W. E. B. Du Bois, Paul Robeson, and Judge Robert L. Carter.
History:
Early years:
Although the term critical race theory originated in its application to law, the subject emerges out of the broader frame of critical theory, which analyzes power structures in society regardless of whatever laws may be in effect.
In the 1998 article, "Critical Race Theory: Past, Present, and Future", Delgado and Stefancic trace the origins of CRT to the early writings of Derrick Albert Bell Jr. including his 1976 Yale Law Journal article, "Serving Two Masters" and his 1980 Harvard Law Review article entitled "Brown v. Board of Education and the Interest-Convergence Dilemma".
In the 1970s, as a professor at Harvard Law School, Bell began to critique, question, and re-assess the civil rights cases he had litigated in the 1960s to desegregate schools following the decision in Brown v. Board of Education. This re-assessment became the "cornerstone of critical race theory".
Delgado and Stefancic, who together wrote Critical Race Theory: An Introduction in 2001, described Bell's "interest convergence" as a "means of understanding Western racial history". The focus on desegregation after the 1954 Supreme Court decision in Brown—declaring school segregation unconstitutional—left "civil-rights lawyers compromised between their clients' interests and the law".
The concern of many Black parents—for their children's access to better education—was being eclipsed by the interests of litigators who wanted a "breakthrough" in their "pursuit of racial balance in schools". In 1995, Cornel West said that Bell was "virtually the lone dissenter" writing in leading law reviews who challenged basic assumptions about how the law treated people of color.
In his Harvard Law Review articles, Bell cites the 1964 Hudson v. Leake County School Board case which the NAACP Legal Defense and Educational Fund (NAACP LDF) won, mandating that the all-white school board comply with desegregation. At that time it was seen as a success.
By the 1970s, White parents were removing their children from the desegregated schools and enrolling them in segregation academies. Bell came to believe that he had been mistaken in 1964 when, as a young lawyer working for the LDF, he had convinced Winson Hudson, who was the head of the newly formed local NAACP chapter in Harmony, Mississippi, to fight the all-White Leake County School Board to desegregate schools.
She and the other Black parents had initially sought LDF assistance to fight the board's closure of their school—one of the historic Rosenwald Schools for Black children.
Bell explained to Hudson that—following Brown—the LDF could not fight to keep a segregated Black school open; they would have to fight for desegregation.
In 1964, Bell and the NAACP had believed that resources for desegregated schools would be increased and Black children would gain access to higher-quality education, since White parents would insist on better-quality schools; by the 1970s, Black children were again attending segregated schools and the quality of education had deteriorated.
Bell began to work for the NAACP LDF shortly after the Montgomery bus boycott and the ensuing 1956 Supreme Court ruling following Browder v. Gayle that the Alabama and Montgomery bus segregation laws were unconstitutional. From 1960 to 1966 Bell successfully litigated 300 civil rights cases in Mississippi.
Bell was inspired by Thurgood Marshall, who had been one of the two leaders of a decades-long legal campaign starting in the 1930s, in which they filed hundreds of lawsuits to reverse the "separate but equal" doctrine announced by the Supreme Court's decision in Plessy v. Ferguson (1896).
The Court ruled that racial segregation laws enacted by the states were not in violation of the United States Constitution as long as the facilities for each race were equal in quality. The Plessy decision provided the legal mandate at the federal level to enforce Jim Crow laws that had been introduced by white Southern Democrats starting in the 1870s for racial segregation in all public facilities, including public schools.
The Court's 1954 Brown decision—which held that the "separate but equal" doctrine is unconstitutional in the context of public schools and educational facilities—severely weakened Plessy. The Supreme Court's concept of constitutional colorblindness with regard to case evaluation began with Plessy.
Before Plessy, the Court considered color as a determining factor in many landmark cases, which reinforced Jim Crow laws. Bell's 1960s civil rights work built on Justice Marshall's groundwork begun in the 1930s. It was a time when the legal branch of the civil rights movement was launching thousands of civil rights cases. It was a period of idealism for the civil rights movement.
At Harvard, Bell developed new courses that studied American law through a racial lens. He compiled his own course materials which were published in 1970 under the title Race, Racism, and American Law. He became Harvard Law School's first Black tenured professor in 1971.
During the 1970s, the courts were enforcing affirmative action programs and mandating busing to achieve racial integration in school districts that resisted desegregation. In response, neoconservative think tanks—hostile to these two issues in particular—developed a color-blind rhetoric to oppose them, claiming they represented reverse discrimination.
Bell's skepticism that racism would end deepened in 1978, when Bakke won the landmark Supreme Court case Regents of the University of California v. Bakke using the argument of reverse discrimination. Justice Lewis F. Powell Jr. held that the "guarantee of equal protection cannot mean one thing when applied to one individual and something else when applied to a person of another color."
In a 1979 article, Bell asked if there were any groups of the White population that would be willing to suffer any disadvantage that might result from the implementation of a policy to rectify harms to Black people resulting from slavery, segregation, or discrimination.
Bell resigned from Harvard in 1980 because of what he viewed as the university's discriminatory practices, became dean of the University of Oregon School of Law, and later returned to Harvard as a visiting professor.
While he was absent from Harvard, his supporters organized protests against Harvard's lack of racial diversity in the curriculum, in the student body and in the faculty. The university had rejected student requests, saying no sufficiently qualified black instructor existed.
Legal scholar Randall Kennedy writes that some students had "felt affronted" by Harvard's choice to employ an "archetypal white liberal... in a way that precludes the development of black leadership".
One of these students was Kimberlé Crenshaw, who had chosen Harvard in order to study under Bell; she was introduced to his work at Cornell. Crenshaw organized the student-led initiative to offer an alternative course on race and law in 1981—based on Bell's course and textbook—where students brought in visiting professors, such as Charles Lawrence, Linda Greene, Neil Gotanda, and Richard Delgado, to teach chapter-by-chapter from Race, Racism, and American Law.
Critical race theory emerged as an intellectual movement with the organization of this boycott; CRT scholars included graduate law students and professors.
Alan Freeman was a founding member of the Critical Legal Studies (CLS) movement that hosted forums in the 1980s. CLS legal scholars challenged claims to the alleged value-neutral position of the law. They criticized the legal system's role in generating and legitimizing oppressive social structures which contributed to maintaining an unjust and oppressive class system.
Delgado and Stefancic cite the work of Alan Freeman in the 1970s as formative to critical race theory. In his 1978 Minnesota Law Review article Freeman reinterpreted, through a critical legal studies perspective, how the Supreme Court oversaw civil rights legislation from 1953 to 1969 under the Warren Court. He criticized the narrow interpretation of the law which denied relief for victims of racial discrimination.
In his article, Freeman describes two perspectives on the concept of racial discrimination: that of victim or perpetrator. Racial discrimination to the victim includes both objective conditions and the "consciousness associated with those objective conditions".
To the perpetrator, racial discrimination consists only of actions without consideration of the objective conditions experienced by the victims, such as the "lack of jobs, lack of money, lack of housing". Only those individuals who could prove they were victims of discrimination were deserving of remedies.
By the late 1980s, Freeman, Bell, and other CRT scholars left the CLS movement claiming it was too narrowly focused on class and economic structures while neglecting the role of race and race relations in American law.
Emergence as a movement:
In 1989, Kimberlé Crenshaw, Neil Gotanda, and Stephanie Phillips organized a workshop at the University of Wisconsin-Madison entitled "New Developments in Critical Race Theory". The organizers coined the term "Critical Race Theory" to signify an "intersection of critical theory and race, racism and the law."
Afterward, legal scholars began publishing a higher volume of works employing critical race theory, including more than "300 leading law review articles" and books. In 1990, Duncan Kennedy published his article on affirmative action in legal academia in the Duke Law Journal, and Anthony E. Cook published his article "Beyond Critical Legal Studies" in the Harvard Law Review.
In 1991, Patricia Williams published The Alchemy of Race and Rights, while Derrick Bell published Faces at the Bottom of the Well in 1992. Cheryl I. Harris published her 1993 Harvard Law Review article "Whiteness as Property" in which she described how passing led to benefits akin to owning property. In 1995, two dozen legal scholars contributed to a major compilation of key writings on CRT.
By the early 1990s, key concepts and features of CRT had emerged. Bell had introduced his concept of "interest convergence" in his 1980 article. He developed the concept of racial realism in a 1992 series of essays and his book Faces at the Bottom of the Well: The Permanence of Racism.
He said that Black people needed to accept that civil rights era legislation would not on its own bring about progress in race relations; anti-Black racism in the U.S. was a "permanent fixture" of American society; and equality was "impossible and illusory" in the US. Crenshaw introduced the term intersectionality in 1989.
In 1995, pedagogical theorists Gloria Ladson-Billings and William F. Tate began applying the critical race theory framework in the field of education. In their 1995 article Ladson-Billings and Tate described the role of the social construction of white norms and interests in education. They sought to better understand inequities in schooling.
Scholars have since expanded work to explore issues including:
As of 2002, over 20 American law schools and at least three non-American law schools offered critical race theory courses or classes.
Critical race theory is also applied in the fields of:
Other movements developed that apply critical race theory to specific groups. These include the Latino-critical (LatCrit), queer-critical, and Asian-critical movements. These continued to engage with the main body of critical theory research, over time developing independent priorities and research methods.
CRT has also been taught internationally, including in the United Kingdom (UK) and Australia. According to educational researcher Mike Cole, the main proponents of CRT in the UK include David Gillborn, John Preston, and Namita Chakrabarty.
Philosophical foundations:
CRT scholars draw on the work of Antonio Gramsci, Sojourner Truth, Frederick Douglass, and W. E. B. Du Bois. Bell shared Paul Robeson's belief that "Black self-reliance and African cultural continuity should form the epistemic basis of Blacks' worldview." Their writing is also informed by the 1960s and 1970s movements such as:
Critical race theory shares many intellectual commitments with critical theory, critical legal studies, feminist jurisprudence, and postcolonial theory.
University of Connecticut philosopher, Lewis Gordon, who has focused on postcolonial phenomenology, and race and racism, wrote that:
Standpoint theory, which has been adopted by some CRT scholars, emerged from the first wave of the women's movement in the 1970s. The main focus of feminist standpoint theory is epistemology—the study of how knowledge is produced. The term was coined by Sandra Harding, an American feminist theorist, and developed by Dorothy Smith in her 1989 publication, The Everyday World as Problematic: A Feminist Sociology.
Smith wrote that by studying how women socially construct their own everyday life experiences, sociologists could ask new questions. Patricia Hill Collins introduced the black feminist standpoint, a collective wisdom of those who share similar perspectives in society, which sought to heighten awareness of these marginalized groups and to provide ways to improve their position in society.
Critical race theory draws on the priorities and perspectives of both critical legal studies (CLS) and conventional civil rights scholarship, while also sharply contesting both of these fields. UC Davis School of Law legal scholar Angela P. Harris describes critical race theory as sharing "a commitment to a vision of liberation from racism through right reason" with the civil rights tradition.
CRT deconstructs some premises and arguments of legal theory while simultaneously holding that legally constructed rights are incredibly important. CRT scholars disagreed with the CLS anti-legal-rights stance and did not wish to "abandon the notions of law" completely; they acknowledged that some legislation and reforms had helped people of color.
As described by Derrick Bell, critical race theory in Harris' view is committed to "radical critique of the law (which is normatively deconstructionist) and... radical emancipation by the law (which is normatively reconstructionist)".
University of Edinburgh philosophy professor Tommy J. Curry says that by 2009, the CRT perspective on race as a social construct was accepted by "many race scholars" as a "commonsense view": that race is not "biologically grounded and natural."
Social construct is a term from social constructivism, whose roots can be traced to the early science wars, instigated in part by Thomas Kuhn's 1962 The Structure of Scientific Revolutions.
Ian Hacking, a Canadian philosopher specializing in the philosophy of science, describes how social construction has spread through the social sciences. He cites the social construction of race as an example, asking how race could be "constructed" better.
Criticism:
Academic criticism:
According to the Encyclopaedia Britannica, aspects of CRT have been criticized by "legal scholars and jurists from across the political spectrum." Criticism of CRT has focused on its emphasis on storytelling, its critique of the merit principle and of objective truth, and its thesis of the voice of color.
Critics say it contains a "postmodernist-inspired skepticism of objectivity and truth", and has a tendency to interpret "any racial inequity or imbalance [...] as proof of institutional racism and as grounds for directly imposing racially equitable outcomes in those realms", according to Britannica. Proponents of CRT have also been accused of treating even well-meaning criticism of CRT as evidence of latent racism.
In a 1997 book, law professors Daniel A. Farber and Suzanna Sherry criticized CRT for basing its claims on personal narrative and for its lack of testable hypotheses and measurable data. CRT scholars including Crenshaw, Delgado, and Stefancic responded that such critiques represent dominant modes within social science which tend to exclude people of color.
Delgado and Stefancic wrote that "In these realms [social science and politics], truth is a social construct created to suit the purposes of the dominant group." Farber and Sherry have also argued that anti-meritocratic tenets in critical race theory, critical feminism, and critical legal studies may unintentionally lead to antisemitic and anti-Asian implications.
They write that the success of Jews and Asians within what critical race theorists posit to be a structurally unfair system may lend itself to allegations of cheating and advantage-taking.
In response, Delgado and Stefancic write that there is a difference between criticizing an unfair system and criticizing individuals who perform well inside that system.
Public controversies:
See also: 2020s controversies around critical race theory
Australia:
In June 2021, following media reports that the proposed national curriculum was "preoccupied with the oppression, discrimination and struggles of Indigenous Australians", the Australian Senate approved a motion tabled by right-wing senator Pauline Hanson calling on the federal government to reject CRT, despite it not being included in the curriculum.
Despite this, CRT has gained popularity in Australian academic circles, where it is used to investigate Indigenous issues and studies, anti-Muslim racism, and the experiences of Black Africans.
United Kingdom:
Conservatives within the UK government began to criticize CRT in late 2020. Equalities Minister Kemi Badenoch, who is of Nigerian descent, said during a parliamentary debate to mark Black History Month, "We do not want to see teachers teaching their pupils about white privilege and inherited racial guilt [...]
Any school which teaches these elements of critical race theory, or which promotes partisan political views such as defunding the police without offering a balanced treatment of opposing views, is breaking the law."
In an open letter, 101 writers of the Black Writers' Guild denounced Badenoch for remarks about popular anti-racism books such as White Fragility and Why I'm No Longer Talking to White People About Race, made in an interview in The Spectator, in which she said, "many of these books—and, in fact, some of the authors and proponents of critical race theory—actually want a segregated society".
United States:
Critical race theory has stirred controversy in the United States for promoting the use of narrative in legal studies, advocating "legal instrumentalism" as opposed to ideal-driven uses of the law, and encouraging legal scholars to promote racial equity.
Before 1993, the term "critical race theory" was not part of public discourse. In the spring of that year, conservatives launched a campaign led by Clint Bolick to portray Lani Guinier—then-President Bill Clinton's nominee for Assistant Attorney General for Civil Rights—as a radical because of her connection to CRT.
Within months, Clinton had withdrawn the nomination, describing the effort to stop Guinier's appointment as "a campaign of right-wing distortion and vilification".
Amy E. Ansell writes that the logic of legal instrumentalism reached wide public reception in the O. J. Simpson murder case when attorney Johnnie Cochran "enacted a sort of applied CRT", selecting an African-American jury and urging them to acquit Simpson in spite of the evidence against him—a form of jury nullification.
Legal scholar Jeffrey Rosen calls this the "most striking example" of CRT's influence on the U.S. legal system. Law professor Margaret M. Russell responded to Rosen's assertion in the Michigan Law Review, saying that Cochran's "dramatic" and "controversial" courtroom "style and strategic sense" in the Simpson case resulted from his decades of experience as an attorney; it was not significantly influenced by CRT writings.
In 2010, a Mexican-American studies program in Tucson, Arizona, was halted because of a state law forbidding public schools from offering race-conscious education in the form of "advocat[ing] ethnic solidarity instead of the treatment of pupils as individuals". Certain books, including a primer on CRT, were banned from the curriculum.
Matt de la Peña's young-adult novel Mexican WhiteBoy was banned for "containing 'critical race theory'" according to state officials. The ban on ethnic-studies programs was later deemed unconstitutional on the grounds that the state showed discriminatory intent: "Both enactment and enforcement were motivated by racial animus", federal Judge A. Wallace Tashima ruled.
In the run-up to and aftermath of the 2020 U.S. presidential election, opposition to critical race theory was adopted as a campaign theme by Donald Trump and various conservative commentators on Fox News and right-wing talk radio shows.
In September 2020, after seeing a piece on Fox News in which conservative activist Christopher Rufo denounced CRT, Trump issued an executive order directing agencies of the United States federal government to cancel funding for programs that mention "white privilege" or "critical race theory", on the basis that it constituted "divisive, un-American propaganda" and that it was "racist".
In a speech on September 17, 2020, Trump denounced critical race theory and announced the formation of the 1776 Commission to promote "patriotic education". On January 20, 2021, Joe Biden rescinded Trump's order and dissolved the 1776 Commission.
Opposition to what was purported to be critical race theory was subsequently adopted as a major theme by several conservative think tanks and pressure groups, including the Heritage Foundation, the Idaho Freedom Foundation, the American Legislative Exchange Council and organizations funded by the Koch brothers.
According to The Washington Post, conservative lawmakers and activists have used the term as "a catchall phrase for nearly any examination of systemic racism". Rufo wrote on Twitter, "The goal is to have the public read something crazy in the newspaper and immediately think 'critical race theory'."
State-level legislation:
Main article: Censorship of school curricula in the United States
In early 2021, Republican-backed bills were introduced to restrict teaching about race, ethnicity, or slavery in public schools in several states, including:
Several of these bills specifically mention "critical race theory" or single out The New York Times' 1619 Project. CRT is, however, taught at the university level; public school teachers do not generally use the phrase "critical race theory" or its legal frameworks.
In mid-April 2021, a bill was introduced in the Idaho Legislature that would effectively ban any educational entity from teaching or advocating "sectarianism", including critical race theory or other programs involving social justice. On May 4, 2021, the bill was signed into law by Governor Brad Little.
On June 10, 2021, the Florida Board of Education unanimously voted to ban public schools from teaching critical race theory at the urging of Governor Ron DeSantis.
As of July 2021, 10 U.S. states had introduced bills or taken other steps to restrict the teaching of critical race theory, and 26 others were in the process of doing so.
In June 2021, the American Association of University Professors, the American Historical Association, the Association of American Colleges and Universities, and PEN America released a joint statement stating their opposition to such legislation, and by August 2021, 167 professional organizations had signed onto the statement.
In August 2021, the Brookings Institution recorded that eight states—Idaho, Oklahoma, Tennessee, Texas, Iowa, New Hampshire, Arizona, and South Carolina—had passed regulation on the issue, though also noted that none of the bills that passed, with the exception of Idaho's, actually contained the words "critical race theory".
Brookings also noted that these laws often extend beyond race to discussions of gender.
Timothy D. Snyder, historian and professor at Yale University, has called these new state laws memory laws — "government actions designed to guide public interpretation of the past".
Early memory laws were intended to protect victim groups, for example from revisionist attempts by Holocaust deniers. More recently, they have been used by Russia to protect "the feelings of the powerful", then by Donald Trump's 1776 Report in January 2021, and subsequently by Republican-led legislatures submitting these bills.
Snyder called the Idaho version "Kafkaesque in its censorship: It affirms freedom of speech and then bans divisive speech."
As of December 2021, 66 educational gag orders had been filed that year in 26 state legislatures (12 bills had already been passed into law) that would restrict teaching about race in schools, universities, or state agencies, whether by teachers, employers, or contractors.
Penalties vary, but predominantly include loss of funding for schools and institutions. However, in some cases the bills mandate firing of employees.
On January 15, 2022, his first day in office, Virginia Governor Glenn Youngkin signed multiple executive orders, including one barring the teaching of critical race theory in public schools.
Subfields:
Within critical race theory, various sub-groupings focus on issues and nuances unique to particular ethno-racial and/or marginalized communities. This includes the intersection of race with disability, ethnicity, gender, sexuality, class, or religion.
For example:
CRT methodologies have also been applied to the study of white immigrant groups.
CRT has spurred some scholars to call for a second wave of whiteness studies, which is now a small offshoot known as Second Wave Whiteness (SWW). Critical race theory has also begun to spawn research that looks at understandings of race outside the United States.
Disability critical race theory:
Another offshoot field is disability critical race studies (DisCrit), which combines disability studies and CRT to focus on the intersection of disability and race.
Latino critical race theory:
Latino critical race theory (LatCRT or LatCrit) is a research framework that outlines the social construction of race as central to how people of color are constrained and oppressed in society.
Race scholars developed LatCRT as a critical response to the "problem of the color line" first explained by W. E. B. Du Bois. While CRT focuses on the Black–White paradigm, LatCRT has moved to consider other racial groups, mainly Chicana/Chicanos, as well as Latinos/as, Asians, Native Americans/First Nations, and women of color.
In Critical Race Counterstories along the Chicana/Chicano Educational Pipeline, Tara J. Yosso discusses how the constraints on people of color can be defined. Looking at the differences among Chicana/o students, the tenets that frame such analysis are: the intercentricity of race and racism, the challenge to dominant ideology, the commitment to social justice, the centrality of experiential knowledge, and the interdisciplinary perspective.
LatCRT's main focus is to advocate social justice for those living in marginalized communities (specifically Chicana/os), whose lives are shaped by structural arrangements that disadvantage people of color. Social institutions function as instruments of dispossession, disenfranchisement, and discrimination against minority groups, and LatCRT seeks to give voice to those who are victimized.
In order to do so, LatCRT has created two common themes:
First, CRT proposes that white supremacy and racial power are maintained over time, a process in which the law plays a central role. Because different racial groups lack a voice in civil society, CRT has introduced a new critical form of expression, called the voice of color.
The voice of color consists of narratives and storytelling monologues used as devices for conveying personal racial experiences and for countering metanarratives that maintain racial inequality. The experiences of the oppressed are therefore important aspects of developing a LatCRT analytical approach; not since slavery, these scholars argue, has an institution so fundamentally shaped the life opportunities of those who bear the label of criminal.
Second, LatCRT work has investigated the possibility of transforming the relationship between law enforcement and racial power, as well as pursuing a project of achieving racial emancipation and anti-subordination more broadly. Its body of research is distinct from general critical race theory in that it emphasizes immigration theory and policy, language rights, and accent- and national origin-based forms of discrimination.
CRT centers the experiential knowledge of people of color, drawing explicitly on these lived experiences as data and presenting research findings through storytelling, chronicles, scenarios, narratives, and parables.
Asian critical race theory:
Asian critical race theory looks at the influence of race and racism on Asian Americans and their experiences in the U.S. education system. Like Latino critical race theory, Asian critical race theory is distinct from the main body of CRT in its emphasis on immigration theory and policy.
Critical philosophy of race:
The Critical Philosophy of Race (CPR) is inspired by both Critical Legal Studies and Critical Race Theory's use of interdisciplinary scholarship. Both CLS and CRT explore the covert nature of mainstream use of "apparently neutral concepts, such as merit or freedom."
See also:
noun
a set of ideas holding that racial bias is inherent in many parts of western society, especially in its legal and social institutions, on the basis of their having been primarily designed for and implemented by white people.
"I took a class in law school that examined case law through the lens of critical race theory"
___________________________________________________________________________
Critical Race Theory (Wikipedia):
Critical race theory (CRT) is a cross-disciplinary examination, by social and civil-rights scholars and activists, of how laws, social and political movements, and media shape, and are shaped by, social conceptions of race and ethnicity.
Goals include challenging all mainstream and "alternative" views of racism and racial justice, including conservative, liberal, and progressive. The word critical in the name is an academic reference to critical thinking, critical theory, and scholarly criticism, rather than criticizing or blaming people.
CRT is also used in sociology to explain social, political, and legal structures and power distribution as through a "lens" focusing on the concept of race, and experiences of racism. For example, the CRT conceptual framework examines racial bias in laws and legal institutions, such as highly disparate rates of incarceration among racial groups in the United States.
A key CRT concept is intersectionality—the way in which different forms of inequality and identity are affected by interconnections of race, class, gender, and disability. Scholars of CRT view race as a social construct with no biological basis.
One tenet of CRT is that racism and disparate racial outcomes are the result of complex, changing, and often subtle social and institutional dynamics, rather than explicit and intentional prejudices of individuals.
CRT scholars argue that the social and legal construction of race advances the interests of White people at the expense of people of color, and that the liberal notion of U.S. law as "neutral" plays a significant role in maintaining a racially unjust social order, where formally color-blind laws continue to have racially discriminatory outcomes.
CRT began in the United States in the post–civil rights era, as 1960s landmark civil rights laws were being eroded and schools were being re-segregated. With racial inequalities persisting even after civil rights legislation and color-blind laws were enacted, CRT scholars in the 1970s and 1980s began reworking and expanding critical legal studies (CLS) theories on class, economic structure, and the law to examine the role of U.S. law in perpetuating racism.
CRT, a framework of analysis grounded in critical theory, originated in the mid-1970s in the writings of several American legal scholars, including:
- Derrick Bell,
- Alan Freeman,
- Kimberlé Crenshaw,
- Richard Delgado,
- Cheryl Harris,
- Charles R. Lawrence III,
- Mari Matsuda,
- and Patricia J. Williams.
CRT draws from the work of thinkers such as:
- Antonio Gramsci,
- Sojourner Truth,
- Frederick Douglass,
- and W. E. B. Du Bois,
- as well as the
- Black Power,
- Chicano,
- and radical feminist movements from the 1960s and 1970s.
Academic critics of CRT argue it is based on storytelling instead of evidence and reason, rejects truth and merit, and opposes liberalism.
Since 2020, conservative U.S. lawmakers have sought to ban or restrict the instruction of CRT along with other critical education in primary and secondary schools, as well as relevant training inside federal agencies. Advocates of such bans argue that CRT is false, anti-American, villainizes White people, promotes radical leftism, and indoctrinates children.
Advocates of bans on CRT have been accused of misrepresenting the tenets and importance of CRT and of having the goal of broadly silencing discussions of racism, equality, social justice, and the history of race.
Definitions:
In his introduction to the comprehensive 1995 publication of critical race theory's key writings, Cornel West described CRT as "an intellectual movement that is both particular to our postmodern (and conservative) times and part of a long tradition of human resistance and liberation."
Law professor Roy L. Brooks defined critical race theory in 1994 as "a collection of critical stances against the existing legal order from a race-based point of view".
Gloria Ladson-Billings, who—along with co-author William Tate—had introduced CRT to the field of education in 1995, described it in 2015 as an "interdisciplinary approach that seeks to understand and combat race inequity in society." Ladson-Billings wrote in 1998 that CRT "first emerged as a counter-legal scholarship to the positivist and liberal legal discourse of civil rights."
In 2017, University of Alabama School of Law professor Richard Delgado, a co-founder of critical race theory, and legal writer Jean Stefancic defined CRT as "a collection of activists and scholars interested in studying and transforming the relationship among race, racism, and power".
In 2021, Khiara Bridges, a law professor and author of the textbook Critical Race Theory: A Primer, defined critical race theory as an "intellectual movement", a "body of scholarship", and an "analytical tool-set for interrogating the relationship between law and racial inequality."
The 2021 Encyclopaedia Britannica described CRT as an "intellectual and social movement and loosely organized framework of legal analysis based on the premise that race is not a natural, biologically grounded feature of physically distinct subgroups of human beings but a socially constructed (culturally invented) category that is used to oppress and exploit people of colour."
Tenets:
Scholars of CRT say that race is not "biologically grounded and natural" but rather a socially constructed category used to oppress and exploit people of color, and that racism is not an aberration but a normalized feature of American society. According to CRT, negative stereotypes assigned to members of minority groups benefit white people and increase racial oppression.
Individuals can belong to a number of different identity groups. The concept of intersectionality—one of CRT's main concepts—was introduced by legal scholar Kimberlé Crenshaw.
Derrick Albert Bell Jr. (1930–2011), an American lawyer, professor, and civil rights activist, writes that racial equality is "impossible and illusory" and that racism in the U.S. is permanent. According to Bell, civil-rights legislation will not on its own bring about progress in race relations; alleged improvements or advantages to people of color "tend to serve the interests of dominant white groups", in what Bell calls "interest convergence".
These changes do not typically affect—and at times even reinforce—racial hierarchies. This is representative of the shift in the 1970s, in Bell's re-assessment of his earlier desegregation work as a civil rights lawyer. He was responding to the Supreme Court's decisions that had resulted in the re-segregation of schools.
The concept of standpoint theory became particularly relevant to CRT when it was expanded to include a black feminist standpoint by Patricia Hill Collins. First introduced by feminist sociologists in the 1980s, standpoint theory holds that people in marginalized groups, who share similar experiences, can bring a collective wisdom and a unique voice to discussions on decreasing oppression.
In this view, insights into racism can be uncovered by examining the nature of the U.S. legal system through the perspective of the everyday lived experiences of people of color.
According to Encyclopedia Britannica, tenets of CRT have spread beyond academia, and are used to deepen understanding of socio-economic issues such as "poverty, police brutality, and voting rights violations", that are impacted by the ways in which race and racism are "understood and misunderstood" in the United States.
Common themes:
Richard Delgado and Jean Stefancic published an annotated bibliography of CRT references in 1993, listing works of legal scholarship that addressed one or more of the following themes:
- "critique of liberalism";
- "storytelling/counterstorytelling and 'naming one's own reality'";
- "revisionist interpretations of American civil rights law and progress";
- "a greater understanding of the underpinnings of race and racism";
- "structural determinism";
- "race, sex, class, and their intersections";
- "essentialism and anti-essentialism";
- "cultural nationalism/separatism";
- "legal institutions, critical pedagogy, and minorities in the bar";
- and "criticism and self-criticism".
When Gloria Ladson-Billings introduced CRT into education in 1995, she cautioned that its application required a "thorough analysis of the legal literature upon which it is based".
Critique of liberalism:
First and foremost among CRT legal scholars' concerns in 1993 was their "discontent" with the way in which liberalism addressed race issues in the U.S. They critiqued "liberal jurisprudence", including affirmative action, color-blindness, role modeling, and the merit principle.
Specifically, they claimed that the liberal concept of value-neutral law contributed to maintenance of the U.S.'s racially unjust social order.
An example questioning foundational liberal conceptions of Enlightenment values, such as rationalism and progress, is Rennard Strickland's 1986 Kansas Law Review article, "Genocide-at-Law: An Historic and Contemporary View of the Native American Experience". In it, he "introduced Native American traditions and world-views" into law school curriculum, challenging the entrenchment at that time of the "contemporary ideas of progress and enlightenment".
He wrote that U.S. laws that "permeate" the everyday lives of Native Americans were in "most cases carried out with scrupulous legality" but still resulted in what he called "cultural genocide".
In 1993, David Theo Goldberg described how countries that adopt classical liberalism's concepts of "individualism, equality, and freedom"—such as the United States and European countries—conceal structural racism in their cultures and languages, citing terms such as "Third World" and "primitive".
In 1988, Kimberlé Williams Crenshaw traced the origins of the New Right's use of the concept of color-blindness from 1970s neoconservative think tanks to the Ronald Reagan administration in the 1980s. She described how prominent figures such as neoconservative scholars Thomas Sowell and William Bradford Reynolds, who served as Assistant Attorney General for the Civil Rights Division from 1981 to 1988, called for "strictly color-blind policies".
Sowell and Reynolds, like many conservatives at that time, believed that the goal of equality of the races had already been achieved, and therefore the race-specific civil rights movement was a "threat to democracy". The color-blindness logic used in "reverse discrimination" arguments in the post-civil rights period is informed by a particular viewpoint on "equality of opportunity", as adopted by Sowell, in which the state's role is limited to providing a "level playing field", not to promoting equal distribution of resources.
Crenshaw claimed that "equality of opportunity" in antidiscrimination law can have both an expansive and a restrictive aspect. Crenshaw wrote that formally color-blind laws continue to have racially discriminatory outcomes.
According to her, this use of formal color-blindness rhetoric in claims of reverse discrimination, as in the 1978 Supreme Court ruling on Bakke, was a response to the way in which the courts had aggressively imposed affirmative action and busing during the Civil Rights era, even on those who were hostile to those issues.
In 1990, legal scholar Duncan Kennedy described the dominant approach to affirmative action in legal academia as "colorblind meritocratic fundamentalism". He called for a postmodern "race consciousness" approach that included "political and cultural relations" while avoiding "racialism" and "essentialism".
Sociologist Eduardo Bonilla-Silva describes this newer, subtle form of racism as "color-blind racism", which uses frameworks of abstract liberalism to decontextualize race, naturalize outcomes such as segregation in neighborhoods, attribute certain cultural practices to race, and cause "minimization of racism".
In his influential 1984 article, Delgado challenged the liberal concept of meritocracy in civil rights scholarship, questioning why the top articles in the most well-established journals were all written by white men.
Storytelling/counter-storytelling and "naming one's own reality":
The use of narrative (storytelling) to illuminate and explore lived experiences of racial oppression.
One of the prime tenets of liberal jurisprudence is that people can create appealing narratives to think and talk about greater levels of justice. Delgado and Stefancic call this the empathic fallacy—the belief that it is possible to "control our consciousness" by using language alone to overcome bigotry and narrow-mindedness.
They examine how people of color, considered outsiders in mainstream US culture, are portrayed in media and law through stereotypes and stock characters that have been adapted over time to shield the dominant culture from discomfort and guilt.
For example, slaves in the 18th-century Southern States were depicted as childlike and docile; Harriet Beecher Stowe adapted this stereotype through her character Uncle Tom, depicting him as a "gentle, long-suffering", pious Christian.
Following the Civil War, the African-American woman was depicted as a wise, care-giving "Mammy" figure.
During the Reconstruction period, African-American men were stereotyped as "brutish and bestial", a danger to white women and children. This was exemplified in Thomas Dixon Jr.'s novels, used as the basis for the epic film The Birth of a Nation, which celebrated the Ku Klux Klan and lynching.
During the Harlem Renaissance, African-Americans were depicted as "musically talented" and "entertaining". Following World War II, when many Black veterans joined the nascent civil rights movement, African Americans were portrayed as "cocky [and] street-smart", the "unreasonable, opportunistic" militant, the "safe, comforting, cardigan-wearing" TV sitcom character, and the "super-stud" of blaxploitation films.
The empathic fallacy informs the "time-warp aspect of racism", where the dominant culture can see racism only through the hindsight of a past era or distant land, such as South Africa.
Through centuries of stereotypes, racism has become normalized; it is a "part of the dominant narrative we use to interpret experience". Delgado and Stefancic argue that speech alone is an ineffective tool to counter racism, since the system of free expression tends to favor the interests of powerful elites and to assign responsibility for racist stereotypes to the "marketplace of ideas".
In the decades following the passage of civil rights laws, acts of racism had become less overt and more covert—invisible to, and underestimated by, most of the dominant culture.
Since racism makes people feel uncomfortable, the empathic fallacy helps the dominant culture to mistakenly believe that it no longer exists, and that dominant images, portrayals, stock characters, and stereotypes—which usually portray minorities in a negative light—provide them with a true image of race in America.
Based on these narratives, the dominant group has no need to feel guilty or to make an effort to overcome racism, as it feels "right, customary, and inoffensive to those engaged in it", while self-described liberals who uphold freedom of expression can feel virtuous while maintaining their own superior position.
Bryan Brayboy has emphasized the epistemic importance of storytelling in Indigenous-American communities as superseding that of theory, and has proposed a Tribal Critical Race Theory (TribCrit).
Standpoint epistemology:
The view that members of racial minority groups have a unique authority and ability to speak about racism. This is seen as undermining dominant narratives relating to racial inequality, such as legal neutrality and personal responsibility or bootstrapping, through valuable first-hand accounts of the experience of racism.
Revisionist interpretations of American civil rights law and progress:
Interest convergence is a concept introduced by Derrick Bell in his 1980 Harvard Law Review article, "Brown v. Board of Education and the Interest-Convergence Dilemma".
In this article, Bell described how he re-assessed the impact of the hundreds of NAACP LDF de-segregation cases he won from 1960 to 1966, and how he began to believe that in spite of his sincerity at the time, anti-discrimination law had not resulted in improving Black children's access to quality education.
He listed and described how Supreme Court cases had gutted civil rights legislation, with the result that African-American students continued to attend all-Black schools that lacked adequate funding and resources. In examining these Supreme Court cases, Bell concluded that civil-rights legislation was passed only when it coincided with the self-interest of white people, a dynamic Bell termed interest convergence.
One of the best-known examples of interest convergence is the way in which American geopolitics during the Cold War in the aftermath of World War II was a critical factor in the passage of civil rights legislation by both Republicans and Democrats. Bell described this in numerous articles, including the aforementioned, and it was supported by the research and publications of legal scholar Mary L. Dudziak.
In her journal articles and her 2000 book Cold War Civil Rights—based on newly released documents—Dudziak provided detailed evidence that it was in the interest of the United States to quell the negative international press about its treatment of African-Americans, since the majority of the populations of the newly decolonized countries that the U.S. was trying to attract to Western-style democracy were not white.
The U.S. sought to promote liberal values throughout Africa, Asia, and Latin America to prevent the Soviet Union from spreading communism. Dudziak described how the international press widely circulated stories of segregation and violence against African-Americans.
The Moore's Ford lynchings, in which a World War II veteran was lynched, received particularly wide news coverage. American allies followed stories of American racism through the international press, and the Soviets used stories of racism against Black Americans as a vital part of their propaganda.
Dudziak performed extensive archival research in the U.S. Department of State and Department of Justice and concluded that U.S. government support for civil-rights legislation "was motivated in part by the concern that racial discrimination harmed the United States' foreign relations".
When the National Guard was called in to prevent nine African-American students from integrating Little Rock Central High School, the international press covered the story extensively. The then-Secretary of State told President Dwight Eisenhower that the Little Rock situation was "ruining" American foreign policy, particularly in Asia and Africa.
The U.S.'s ambassador to the United Nations told President Eisenhower that as two-thirds of the world's population was not white, he was witnessing their negative reactions to American racial discrimination. He suspected that the U.S. "lost several votes on the Chinese communist item because of Little Rock."
Intersectional theory:
The examination of race, sex, class, national origin, and sexual orientation, and how their intersections play out in various settings, such as how the needs of a Latina are different from those of a Black male, and whose needs are promoted.
These intersections provide a more holistic picture for evaluating different groups of people. Intersectionality is a response to identity politics insofar as identity politics does not take into account the different intersections of people's identities.
Essentialism vs. anti-essentialism:
Delgado and Stefancic write, "Scholars who write about these issues are concerned with the appropriate unit for analysis: Is the black community one, or many, communities? Do middle- and working-class African-Americans have different interests and needs?
Do all oppressed peoples have something in common?" This is a look at the ways that oppressed groups may share in their oppression but also have different needs and values that need to be analyzed differently. It is a question of whether and how groups can be essentialized.
From an essentialist perspective, one's identity consists of an internal "essence" that is static and unchanging from birth, whereas a non-essentialist position holds that "the subject has no fixed or permanent identity."
Racial essentialism diverges into biological and cultural essentialism, where subordinated groups may endorse one over the other. "Cultural and biological forms of racial essentialism share the idea that differences between racial groups are determined by a fixed and uniform essence that resides within and defines all members of each racial group. However, they differ in their understanding of the nature of this essence."
Subordinated communities may be more likely to endorse cultural essentialism because it provides a basis of positive distinction and a foundation for collective resistance, a means of asserting their identities and advocating for their rights; biological essentialism, by contrast, is unlikely to resonate with marginalized groups because dominant groups have historically used genetics and biology to justify racism and oppression.
Essentialism is the idea of a singular, shared experience among a specific group of people. Anti-essentialism, on the other hand, holds that various other factors can affect a person's being and overall life experience. The race of an individual is viewed more as a social construct that does not necessarily dictate the outcome of their life circumstances.
Race is viewed as “a social and historical construction, rather than an inherent, fixed, essential biological characteristic.” Anti-essentialism “forces a destabilization in the very concept of race itself…” The results of this destabilization vary on the analytic focus falling into two general categories, “... consequences for the analytic concepts of racial identity or racial subjectivity.”
Structural determinism, and race, sex, class, and their intersections:
Exploration of how "the structure of legal thought or culture influences its content" in a way that determines social outcomes. Delgado and Stefancic cited "empathic fallacy" as one example of structural determinism—the "idea that our system, by reason of its structure and vocabulary, cannot redress certain types of wrong."
They interrogate the absence of terms such as intersectionality, anti-essentialism, and jury nullification in standard legal reference research tools in law libraries.
Cultural nationalism/separatism:
The exploration of more radical views that argue for separation and reparations as a form of foreign aid (including black nationalism).
Legal institutions, critical pedagogy, and minorities in the bar:
Camara Phyllis Jones defines institutionalized racism as "differential access to the goods, services, and opportunities of society by race. Institutionalized racism is normative, sometimes legalized and often manifests as inherited disadvantage.
It is structural, having been absorbed into our institutions of custom, practice, and law, so there need not be an identifiable offender. Indeed, institutionalized racism is often evident as inaction in the face of need, manifesting itself both in material conditions and in access to power.
With regard to the former, examples include differential access to quality education, sound housing, gainful employment, appropriate medical facilities, and a clean environment."
Black-white binary:
The black-white binary is a paradigm identified by legal scholars through which racial issues and histories are typically articulated within a racial binary between Black and white Americans.
The binary largely governs how race has been portrayed and addressed throughout U.S. history. Critical race theorists Richard Delgado and Jean Stefancic argue that anti-discrimination law has blindspots for non-black minorities due to its language being confined within the black-white binary.
Applications and adaptations:
Scholars of critical race theory have focused, with some particularity, on the issues of hate crime and hate speech. In response to the opinion of the U.S. Supreme Court in the hate speech case of R.A.V. v. City of St. Paul (1992), in which the Court struck down an anti-bias ordinance as applied to a teenager who had burned a cross, Mari Matsuda and Charles Lawrence argued that the Court had paid insufficient attention to the history of racist speech and the actual injury produced by such speech.
Critical race theorists have also argued in favor of affirmative action. They propose that so-called merit standards for hiring and educational admissions are not race-neutral and that such standards are part of the rhetoric of neutrality through which whites justify their disproportionate share of resources and social benefits.
In his 2009 article "Will the Real CRT Please Stand Up: The Dangers of Philosophical Contributions to CRT", philosopher Tommy J. Curry distinguished between the original CRT key writings and what is being done in the name of CRT by a "growing number of white feminists".
The new CRT movement "favors narratives that inculcate the ideals of a post-racial humanity and racial amelioration between compassionate (Black and White) philosophical thinkers dedicated to solving America's race problem."
They are interested in discourse (i.e., how individuals speak about race) and the theories of white Continental philosophers, over and against the structural and institutional accounts of white supremacy which were at the heart of the realist analysis of racism introduced in Derrick Bell's early works, and articulated through such African-American thinkers as W. E. B. Du Bois, Paul Robeson, and Judge Robert L. Carter.
History:
Early years:
Although the term critical race theory began in its application to law, the subject emerges from the broader frame of critical theory in that it analyzes power structures in society regardless of whatever laws may be in effect.
In the 1998 article, "Critical Race Theory: Past, Present, and Future", Delgado and Stefancic trace the origins of CRT to the early writings of Derrick Albert Bell Jr. including his 1976 Yale Law Journal article, "Serving Two Masters" and his 1980 Harvard Law Review article entitled "Brown v. Board of Education and the Interest-Convergence Dilemma".
In the 1970s, as a professor at Harvard Law School, Bell began to critique, question, and re-assess the civil rights cases he had litigated in the 1960s to desegregate schools following the Supreme Court's decision in Brown v. Board of Education. This re-assessment became the "cornerstone of critical race theory".
Delgado and Stefancic, who together wrote Critical Race Theory: An Introduction in 2001, described Bell's "interest convergence" as a "means of understanding Western racial history". The focus on desegregation after the 1954 Supreme Court decision in Brown—declaring school segregation unconstitutional—left "civil-rights lawyers compromised between their clients' interests and the law".
The concern of many Black parents—for their children's access to better education—was being eclipsed by the interests of litigators who wanted a "breakthrough" in their "pursuit of racial balance in schools". In 1995, Cornel West said that Bell was "virtually the lone dissenter" writing in leading law reviews who challenged basic assumptions about how the law treated people of color.
In his Harvard Law Review articles, Bell cites the 1964 Hudson v. Leake County School Board case which the NAACP Legal Defense and Educational Fund (NAACP LDF) won, mandating that the all-white school board comply with desegregation. At that time it was seen as a success.
By the 1970s, White parents were removing their children from the desegregated schools and enrolling them in segregation academies. Bell came to believe that he had been mistaken in 1964 when, as a young lawyer working for the LDF, he had convinced Winson Hudson, who was the head of the newly formed local NAACP chapter in Harmony, Mississippi, to fight the all-White Leake County School Board to desegregate schools.
She and the other Black parents had initially sought LDF assistance to fight the board's closure of their school—one of the historic Rosenwald Schools for Black children.
Bell explained to Hudson that—following Brown—the LDF could not fight to keep a segregated Black school open; they would have to fight for desegregation.
In 1964, Bell and the NAACP had believed that desegregated schools would receive increased resources and that Black children would gain access to higher-quality education, since White parents would insist on better schools; by the 1970s, Black children were again attending segregated schools and the quality of education had deteriorated.
Bell began to work for the NAACP LDF shortly after the Montgomery bus boycott and the Supreme Court's ensuing 1956 ruling in Browder v. Gayle that the Alabama and Montgomery bus segregation laws were unconstitutional. From 1960 to 1966, Bell successfully litigated 300 civil rights cases in Mississippi.
Bell was inspired by Thurgood Marshall, who had been one of the two leaders of a decades-long legal campaign starting in the 1930s, in which they filed hundreds of lawsuits to reverse the "separate but equal" doctrine announced by the Supreme Court's decision in Plessy v. Ferguson (1896).
The Court ruled that racial segregation laws enacted by the states were not in violation of the United States Constitution as long as the facilities for each race were equal in quality. The Plessy decision provided the legal mandate at the federal level to enforce Jim Crow laws that had been introduced by white Southern Democrats starting in the 1870s for racial segregation in all public facilities, including public schools.
The Court's 1954 Brown decision—which held that the "separate but equal" doctrine is unconstitutional in the context of public schools and educational facilities—severely weakened Plessy. The Supreme Court's concept of constitutional colorblindness with regard to case evaluation began with Plessy.
Before Plessy, the Court considered color as a determining factor in many landmark cases, which reinforced Jim Crow laws. Bell's 1960s civil rights work built on Justice Marshall's groundwork begun in the 1930s. It was a time when the legal branch of the civil rights movement was launching thousands of civil rights cases. It was a period of idealism for the civil rights movement.
At Harvard, Bell developed new courses that studied American law through a racial lens. He compiled his own course materials which were published in 1970 under the title Race, Racism, and American Law. He became Harvard Law School's first Black tenured professor in 1971.
During the 1970s, courts were enforcing affirmative action programs and mandating busing to achieve racial integration in school districts that resisted desegregation. In response, neoconservative think tanks—hostile to these two policies in particular—developed a color-blind rhetoric to oppose them, claiming they represented reverse discrimination.
In 1978, when Allan Bakke won the landmark Supreme Court case Regents of the University of California v. Bakke by using the argument of reverse racism, Bell's skepticism that racism would end deepened. Justice Lewis F. Powell Jr. held that the "guarantee of equal protection cannot mean one thing when applied to one individual and something else when applied to a person of another color."
In a 1979 article, Bell asked if there were any groups of the White population that would be willing to suffer any disadvantage that might result from the implementation of a policy to rectify harms to Black people resulting from slavery, segregation, or discrimination.
Bell resigned in 1980 because of what he viewed as the university's discriminatory practices, became the dean at University of Oregon School of Law and later returned to Harvard as a visiting professor.
While he was absent from Harvard, his supporters organized protests against Harvard's lack of racial diversity in the curriculum, in the student body and in the faculty. The university had rejected student requests, saying no sufficiently qualified black instructor existed.
Legal scholar Randall Kennedy writes that some students had "felt affronted" by Harvard's choice to employ an "archetypal white liberal... in a way that precludes the development of black leadership".
One of these students was Kimberlé Crenshaw, who had chosen Harvard in order to study under Bell; she was introduced to his work at Cornell. Crenshaw organized the student-led initiative to offer an alternative course on race and law in 1981—based on Bell's course and textbook—where students brought in visiting professors, such as Charles Lawrence, Linda Greene, Neil Gotanda, and Richard Delgado, to teach chapter-by-chapter from Race, Racism, and American Law.
Critical race theory emerged as an intellectual movement with the organization of this boycott; CRT scholars included graduate law students and professors.
Alan Freeman was a founding member of the Critical Legal Studies (CLS) movement that hosted forums in the 1980s. CLS legal scholars challenged claims to the alleged value-neutral position of the law. They criticized the legal system's role in generating and legitimizing oppressive social structures which contributed to maintaining an unjust and oppressive class system.
Delgado and Stefancic cite the work of Alan Freeman in the 1970s as formative to critical race theory. In his 1978 Minnesota Law Review article Freeman reinterpreted, through a critical legal studies perspective, how the Supreme Court oversaw civil rights legislation from 1953 to 1969 under the Warren Court. He criticized the narrow interpretation of the law which denied relief for victims of racial discrimination.
In his article, Freeman describes two perspectives on the concept of racial discrimination: that of victim or perpetrator. Racial discrimination to the victim includes both objective conditions and the "consciousness associated with those objective conditions".
To the perpetrator, racial discrimination consists only of actions without consideration of the objective conditions experienced by the victims, such as the "lack of jobs, lack of money, lack of housing". Only those individuals who could prove they were victims of discrimination were deserving of remedies.
By the late 1980s, Freeman, Bell, and other CRT scholars left the CLS movement claiming it was too narrowly focused on class and economic structures while neglecting the role of race and race relations in American law.
Emergence as a movement:
In 1989, Kimberlé Crenshaw, Neil Gotanda, and Stephanie Phillips organized a workshop at the University of Wisconsin-Madison entitled "New Developments in Critical Race Theory". The organizers coined the term "Critical Race Theory" to signify an "intersection of critical theory and race, racism and the law."
Afterward, legal scholars began publishing a higher volume of works employing critical race theory, including more than "300 leading law review articles" and books. In 1990, Duncan Kennedy published his article on affirmative action in legal academia in the Duke Law Journal, and Anthony E. Cook published his article "Beyond Critical Legal Studies" in the Harvard Law Review.
In 1991, Patricia Williams published The Alchemy of Race and Rights, while Derrick Bell published Faces at the Bottom of the Well in 1992. Cheryl I. Harris published her 1993 Harvard Law Review article "Whiteness as Property" in which she described how passing led to benefits akin to owning property. In 1995, two dozen legal scholars contributed to a major compilation of key writings on CRT.
By the early 1990s, key concepts and features of CRT had emerged. Bell had introduced his concept of "interest convergence" in his 1980 article. He developed the concept of racial realism in a 1992 series of essays and his book Faces at the Bottom of the Well: The Permanence of Racism.
He said that Black people needed to accept that the civil rights era legislation would not on its own bring about progress in race relations; anti-Black racism in the U.S. was a "permanent fixture" of American society; and equality was "impossible and illusory" in the US. Crenshaw introduced the term intersectionality in 1989.
In 1995, pedagogical theorists Gloria Ladson-Billings and William F. Tate began applying the critical race theory framework in the field of education. In their 1995 article Ladson-Billings and Tate described the role of the social construction of white norms and interests in education. They sought to better understand inequities in schooling.
Scholars have since expanded work to explore issues including:
- school segregation in the U.S.;
- relations between race, gender, and academic achievement;
- pedagogy;
- and research methodologies.
As of 2002, over 20 American law schools and at least three non-American law schools offered critical race theory courses or classes.
Critical race theory is also applied in the fields of:
- education,
- political science,
- women's studies,
- ethnic studies,
- communication,
- sociology,
- and American studies.
Other movements developed that apply critical race theory to specific groups. These include the Latino-critical (LatCrit), queer-critical, and Asian-critical movements. These continued to engage with the main body of critical theory research, over time developing independent priorities and research methods.
CRT has also been taught internationally, including in the United Kingdom (UK) and Australia. According to educational researcher Mike Cole, the main proponents of CRT in the UK include David Gillborn, John Preston, and Namita Chakrabarty.
Philosophical foundations:
CRT scholars draw on the work of Antonio Gramsci, Sojourner Truth, Frederick Douglass, and W. E. B. Du Bois. Bell shared Paul Robeson's belief that "Black self-reliance and African cultural continuity should form the epistemic basis of Blacks' worldview." Their writing is also informed by the radical social movements of the 1960s and 1970s.
Critical race theory shares many intellectual commitments with critical theory, critical legal studies, feminist jurisprudence, and postcolonial theory.
University of Connecticut philosopher Lewis Gordon, who has focused on postcolonial phenomenology and on race and racism, wrote that CRT is notable for its use of postmodern poststructural scholarship, including an emphasis on "subaltern" or "marginalized" communities and the "use of alternative methodology in the expression of theoretical work, most notably their use of 'narratives' and other literary techniques".
Standpoint theory, which has been adopted by some CRT scholars, emerged from the second wave of the women's movement in the 1970s. The main focus of feminist standpoint theory is epistemology—the study of how knowledge is produced. The term was coined by Sandra Harding, an American feminist theorist, and developed by Dorothy Smith in her 1989 publication, The Everyday World as Problematic: A Feminist Sociology.
Smith wrote that by studying how women socially construct their own everyday life experiences, sociologists could ask new questions. Patricia Hill Collins introduced black feminist standpoint—a collective wisdom of those who have similar perspectives in society which sought to heighten awareness to these marginalized groups and provide ways to improve their position in society.
Critical race theory draws on the priorities and perspectives of both critical legal studies (CLS) and conventional civil rights scholarship, while also sharply contesting both of these fields. UC Davis School of Law legal scholar Angela P. Harris describes critical race theory as sharing "a commitment to a vision of liberation from racism through right reason" with the civil rights tradition.
CRT deconstructs some premises and arguments of legal theory while simultaneously holding that legally constructed rights are incredibly important. CRT scholars disagreed with the CLS anti-legal-rights stance and did not wish to "abandon the notions of law" completely; they acknowledged that some legislation and reforms had helped people of color.
As described by Derrick Bell, critical race theory in Harris' view is committed to "radical critique of the law (which is normatively deconstructionist) and... radical emancipation by the law (which is normatively reconstructionist)".
University of Edinburgh philosophy professor Tommy J. Curry says that by 2009, the CRT perspective on race as a social construct was accepted by "many race scholars" as a "commonsense view" that race is not "biologically grounded and natural."
Social construct is a term from social constructivism, whose roots can be traced to the early science wars, instigated in part by Thomas Kuhn's 1962 The Structure of Scientific Revolutions.
Ian Hacking, a Canadian philosopher specializing in the philosophy of science, describes how social construction has spread through the social sciences. He cites the social construction of race as an example, asking how race could be "constructed" better.
Criticism:
Academic criticism:
According to the Encyclopaedia Britannica, aspects of CRT have been criticized by "legal scholars and jurists from across the political spectrum." Criticism of CRT has focused on its emphasis on storytelling, its critique of the merit principle and of objective truth, and its thesis of the voice of color.
Critics say it contains a "postmodernist-inspired skepticism of objectivity and truth", and has a tendency to interpret "any racial inequity or imbalance [...] as proof of institutional racism and as grounds for directly imposing racially equitable outcomes in those realms", according to Britannica. Proponents of CRT have also been accused of treating even well-meaning criticism of CRT as evidence of latent racism.
In a 1997 book, law professors Daniel A. Farber and Suzanna Sherry criticized CRT for basing its claims on personal narrative and for its lack of testable hypotheses and measurable data. CRT scholars including Crenshaw, Delgado, and Stefancic responded that such critiques represent dominant modes within social science which tend to exclude people of color.
Delgado and Stefancic wrote that "In these realms [social science and politics], truth is a social construct created to suit the purposes of the dominant group." Farber and Sherry have also argued that anti-meritocratic tenets in critical race theory, critical feminism, and critical legal studies may unintentionally lead to antisemitic and anti-Asian implications.
They write that the success of Jews and Asians within what critical race theorists posit to be a structurally unfair system may lend itself to allegations of cheating and advantage-taking.
In response, Delgado and Stefancic write that there is a difference between criticizing an unfair system and criticizing individuals who perform well inside that system.
Public controversies:
See also: 2020s controversies around critical race theory
Australia:
In June 2021, following media reports that the proposed national curriculum was "preoccupied with the oppression, discrimination and struggles of Indigenous Australians", the Australian Senate approved a motion tabled by right-wing senator Pauline Hanson calling on the federal government to reject CRT, despite it not being included in the curriculum.
Nevertheless, CRT has gained increasing popularity in Australian academic circles as a framework for investigating Indigenous studies, anti-Muslim racism, and Black Africans' experiences.
United Kingdom:
Conservatives within the UK government began to criticize CRT in late 2020. Equalities Minister Kemi Badenoch, who is of Nigerian descent, said during a parliamentary debate to mark Black History Month, "We do not want to see teachers teaching their pupils about white privilege and inherited racial guilt [...]
Any school which teaches these elements of critical race theory, or which promotes partisan political views such as defunding the police without offering a balanced treatment of opposing views, is breaking the law."
In an open letter, 101 writers of the Black Writers' Guild denounced Badenoch for remarks about popular anti-racism books such as White Fragility and Why I'm No Longer Talking to White People About Race, made in an interview in The Spectator, in which she said, "many of these books—and, in fact, some of the authors and proponents of critical race theory—actually want a segregated society".
United States:
Critical race theory has stirred controversy in the United States for promoting the use of narrative in legal studies, advocating "legal instrumentalism" as opposed to ideal-driven uses of the law, and encouraging legal scholars to promote racial equity.
Before 1993, the term "critical race theory" was not part of public discourse. In the spring of that year, conservatives launched a campaign led by Clint Bolick to portray Lani Guinier—then-President Bill Clinton's nominee for Assistant Attorney General for Civil Rights—as a radical because of her connection to CRT.
Within months, Clinton had withdrawn the nomination, describing the effort to stop Guinier's appointment as "a campaign of right-wing distortion and vilification".
Amy E. Ansell writes that the logic of legal instrumentalism reached wide public reception in the O. J. Simpson murder case when attorney Johnnie Cochran "enacted a sort of applied CRT", selecting an African-American jury and urging them to acquit Simpson in spite of the evidence against him—a form of jury nullification.
Legal scholar Jeffrey Rosen calls this the "most striking example" of CRT's influence on the U.S. legal system. Law professor Margaret M. Russell responded to Rosen's assertion in the Michigan Law Review, saying that Cochran's "dramatic" and "controversial" courtroom "style and strategic sense" in the Simpson case resulted from his decades of experience as an attorney; it was not significantly influenced by CRT writings.
In 2010, a Mexican-American studies program in Tucson, Arizona, was halted because of a state law forbidding public schools from offering race-conscious education in the form of "advocat[ing] ethnic solidarity instead of the treatment of pupils as individuals". Certain books, including a primer on CRT, were banned from the curriculum.
Matt de la Peña's young-adult novel Mexican WhiteBoy was banned for "containing 'critical race theory'" according to state officials. The ban on ethnic-studies programs was later deemed unconstitutional on the grounds that the state showed discriminatory intent: "Both enactment and enforcement were motivated by racial animus", federal Judge A. Wallace Tashima ruled.
In the run-up to and aftermath of the 2020 U.S. presidential election, opposition to critical race theory was adopted as a campaign theme by Donald Trump and various conservative commentators on Fox News and right-wing talk radio shows.
In September 2020, after seeing a piece on Fox News in which conservative activist Christopher Rufo denounced CRT, Trump issued an executive order directing agencies of the United States federal government to cancel funding for programs that mention "white privilege" or "critical race theory", on the basis that it constituted "divisive, un-American propaganda" and that it was "racist".
In a speech on September 17, 2020, Trump denounced critical race theory and announced the formation of the 1776 Commission to promote "patriotic education". On January 20, 2021, Joe Biden rescinded Trump's order and dissolved the 1776 Commission.
Opposition to what was purported to be critical race theory was subsequently adopted as a major theme by several conservative think tanks and pressure groups, including the Heritage Foundation, the Idaho Freedom Foundation, the American Legislative Exchange Council and organizations funded by the Koch brothers.
According to The Washington Post, conservative lawmakers and activists have used the term as "a catchall phrase for nearly any examination of systemic racism". Rufo wrote on Twitter, "The goal is to have the public read something crazy in the newspaper and immediately think 'critical race theory'."
State-level legislation:
Main article: Censorship of school curricula in the United States
In early 2021, Republican-backed bills were introduced in several states to restrict teaching about race, ethnicity, or slavery in public schools.
Several of these bills specifically mention "critical race theory" or single out The New York Times' 1619 Project, even though CRT is taught at the university level and public school teachers do not generally use the phrase "critical race theory" or its legal frameworks.
In mid-April 2021, a bill was introduced in the Idaho Legislature that would effectively ban any educational entity from teaching or advocating "sectarianism", including critical race theory or other programs involving social justice. On May 4, 2021, the bill was signed into law by Governor Brad Little.
On June 10, 2021, at the urging of Governor Ron DeSantis, the Florida Board of Education unanimously voted to ban public schools from teaching critical race theory.
As of July 2021, ten U.S. states had introduced bills or taken other steps to restrict the teaching of critical race theory, and 26 others were in the process of doing so.
In June 2021, the American Association of University Professors, the American Historical Association, the Association of American Colleges and Universities, and PEN America released a joint statement stating their opposition to such legislation, and by August 2021, 167 professional organizations had signed onto the statement.
In August 2021, the Brookings Institution recorded that eight states—Idaho, Oklahoma, Tennessee, Texas, Iowa, New Hampshire, Arizona, and South Carolina—had passed regulation on the issue, though also noted that none of the bills that passed, with the exception of Idaho's, actually contained the words "critical race theory".
Brookings also noted that these laws often extend beyond race to discussions of gender.
Timothy D. Snyder, historian and professor at Yale University, has called these new state laws memory laws — "government actions designed to guide public interpretation of the past".
Early memory laws were intended to protect victim groups, for example from revisionist attempts by Holocaust deniers, but more recently they have been used by Russia to protect "the feelings of the powerful", then by Donald Trump's 1776 Report in January 2021, and subsequently by Republican-led legislatures submitting these bills.
Snyder called the Idaho version "Kafkaesque in its censorship: It affirms freedom of speech and then bans divisive speech."
As of December 2021, 66 educational gag orders had been filed for the year in 26 state legislatures (12 bills had already been passed into law) that would restrict the teaching of critical race theory in schools, universities, or state agencies by teachers, employers, or contractors.
Penalties vary but predominantly involve loss of funding for schools and institutions; in some cases the bills mandate the firing of employees.
On January 15, 2022, his first day in office, Governor of Virginia Glenn Youngkin signed multiple executive orders, including one barring the teaching of critical race theory in public schools.
Subfields:
Within critical race theory, various sub-groupings focus on issues and nuances unique to particular ethno-racial and/or marginalized communities. This includes the intersection of race with disability, ethnicity, gender, sexuality, class, or religion.
Examples include:
- disability critical race studies (DisCrit),
- critical race feminism (CRF),
- Jewish critical race theory (HebCrit),
- Black critical race theory (BlackCrit),
- Latino critical race studies (LatCrit),
- Asian American critical race studies (AsianCrit),
- South Asian American critical race studies (DesiCrit),
- quantitative critical race theory (QuantCrit),
- and American Indian critical race studies (sometimes called TribalCrit).
CRT methodologies have also been applied to the study of white immigrant groups.
CRT has spurred some scholars to call for a second wave of whiteness studies, which is now a small offshoot known as Second Wave Whiteness (SWW). Critical race theory has also begun to spawn research that looks at understandings of race outside the United States.
Disability critical race theory:
Another offshoot field is disability critical race studies (DisCrit), which combines disability studies and CRT to focus on the intersection of disability and race.
Latino critical race theory:
Latino critical race theory (LatCRT or LatCrit) is a research framework that outlines the social construction of race as central to how people of color are constrained and oppressed in society.
Race scholars developed LatCRT as a critical response to the "problem of the color line" first explained by W. E. B. Du Bois. While CRT focuses on the Black–White paradigm, LatCRT has moved to consider other racial groups, mainly Chicana/Chicanos, as well as Latinos/as, Asians, Native Americans/First Nations, and women of color.
In Critical Race Counterstories along the Chicana/Chicano Educational Pipeline, Tara J. Yosso discusses how the constraints on people of color can be defined. Looking at the experiences of Chicana/o students, she identifies five tenets: the intercentricity of race and racism, the challenge to dominant ideology, the commitment to social justice, the centrality of experiential knowledge, and the interdisciplinary perspective.
LatCRT's main focus is to advocate social justice for those living in marginalized communities (specifically Chicana/os), who are constrained by structural arrangements that disadvantage people of color. Social institutions act as instruments of dispossession, disenfranchisement, and discrimination against minority groups, and LatCRT seeks to give voice to those who are victimized.
In order to do so, LatCRT has created two common themes:
First, CRT proposes that white supremacy and racial power are maintained over time, a process in which the law plays a central role. Because different racial groups lack a voice in civil society, CRT has introduced a new critical form of expression, called the voice of color.
The voice of color comprises narratives and storytelling monologues used as devices for conveying personal racial experiences and for countering the metanarratives that maintain racial inequality. The experiences of the oppressed are therefore important to developing a LatCRT analytical approach; these scholars argue that not since slavery has an institution so fundamentally shaped the life opportunities of those who bear the label of criminal.
Second, LatCRT work has investigated the possibility of transforming the relationship between law enforcement and racial power, as well as pursuing a project of achieving racial emancipation and anti-subordination more broadly. Its body of research is distinct from general critical race theory in that it emphasizes immigration theory and policy, language rights, and accent- and national origin-based forms of discrimination.
CRT centers the experiential knowledge of people of color, drawing explicitly on these lived experiences as data and presenting research findings through storytelling, chronicles, scenarios, narratives, and parables.
Asian critical race theory:
Asian critical race theory looks at the influence of race and racism on Asian Americans and their experiences in the U.S. education system. Like Latino critical race theory, Asian critical race theory is distinct from the main body of CRT in its emphasis on immigration theory and policy.
Critical philosophy of race:
The Critical Philosophy of Race (CPR) is inspired by both Critical Legal Studies and Critical Race Theory's use of interdisciplinary scholarship. Both CLS and CRT explore the covert nature of mainstream use of "apparently neutral concepts, such as merit or freedom."
See also:
- Anti-bias curriculum
- Cultural hegemony
- Judicial aspects of race in the United States
- Institutional or systemic racism
- Racism in the United States
- Slavery in the United States
- White privilege
Slavery in the United States
- YouTube Video by Neil deGrasse Tyson: "Calling Me a 'Black Scientist' Ghettofies the Conversation"
- YouTube Video: Barack Obama - the legacy of the first African-American president
- YouTube Video of the Life and Legacy of Martin Luther King, Jr.
- TOP ROW (Left): 400 years since slavery: a timeline of American history | Race | The Guardian
- TOP ROW (Right): Voices of Freedom | Atlanta History Center
- BOTTOM ROW (Left): The Fifteenth Amendment prohibited discrimination in voting rights on the basis of race, color, or previous condition of servitude (i.e., slavery). While the amendment was not all-encompassing, in that women were not included, it was an extremely significant step in affirming the liberties of African American men. This print depicts a huge parade held in Baltimore, Maryland, on May 19, 1870, surrounded by portraits of abolitionists and scenes of African Americans exercising their rights. Thomas Kelly after James C. Beard, The 15th Amendment. Celebrated May 19th 1870, 1870. Library of Congress.
- BOTTOM ROW (Right): Juneteenth (June 19) is a US federal holiday recognizing the end of slavery and celebrating the achievements, sacrifice, and culture of people with African ancestry. On June 19, 1865, just after the US Civil War, news that slavery had ended finally reached the coast of Texas, sparking an annual commemorative event that has continued to this day. Although Juneteenth has a long history in the African-American community, celebrations in wider circles are more recent, spurred by the increased focus on race relations in 2020.
Slavery in the United States
The legal institution of human chattel slavery, comprising the enslavement primarily of Africans and African Americans, was prevalent in the United States of America from its founding in 1776 until 1865, predominantly in the South. Slavery was established throughout European colonization in the Americas.
From 1526, during the early colonial period, it was practiced in what became Britain's colonies, including the Thirteen Colonies that formed the United States. Under the law, an enslaved person was treated as property that could be bought, sold, or given away.
Slavery lasted in about half of U.S. states until abolition in 1865, and issues concerning slavery seeped into every aspect of national politics, economics, and social custom. In the decades after the end of Reconstruction in 1877, many of slavery's economic and social functions were continued through segregation, sharecropping, and convict leasing.
By the time of the American Revolutionary War (1775–1783), the status of enslaved people had been institutionalized as a racial caste associated with African ancestry. During and immediately following the Revolution, abolitionist laws were passed in most Northern states and a movement developed to abolish slavery.
The role of slavery under the United States Constitution (1789) was the most contentious issue during its drafting. Although the creators of the Constitution never used the word "slavery", the final document, through the three-fifths clause, gave slave owners disproportionate political power by augmenting the congressional representation and the Electoral College votes of slaveholding states.
The Fugitive Slave Clause of the Constitution (Article IV, Section 2, Clause 3) provided that, if a slave escaped to another state, that state had to return the slave to his or her master. This clause was implemented by the Fugitive Slave Act of 1793, passed by Congress.
All Northern states had abolished slavery in some way by 1805, though abolition was sometimes a gradual process, and a few hundred people were still enslaved in the Northern states as late as the 1840 census. Some slaveowners, primarily in the Upper South, freed their slaves, and philanthropists and charitable groups bought and freed others.
The Atlantic slave trade was outlawed by individual states beginning during the American Revolution. The import trade was banned by Congress in 1808, although smuggling was common thereafter. It has been estimated that about 30% of congressmen who were born before 1840 were, at some time in their lives, owners of slaves.
The rapid expansion of the cotton industry in the Deep South after the invention of the cotton gin greatly increased demand for slave labor, and the Southern states continued as slave societies.
The United States became ever more polarized over the issue of slavery, split into slave and free states. Driven by labor demands from new cotton plantations in the Deep South, the Upper South sold more than a million slaves who were taken to the Deep South. The total slave population in the South eventually reached four million.
As the United States expanded, the Southern states attempted to extend slavery into the new western territories to allow proslavery forces to maintain their power in the country. The new territories acquired by the Louisiana Purchase and the Mexican Cession were the subject of major political crises and compromises.
By 1850, the newly rich, cotton-growing South was threatening to secede from the Union, and tensions continued to rise. Bloody fighting broke out over slavery in the Kansas Territory. Slavery was defended in the South as a "positive good", and the largest religious denominations split over the slavery issue into regional organizations of the North and South.
When Abraham Lincoln won the 1860 election on a platform of halting the expansion of slavery, seven slave states seceded to form the Confederacy.
Shortly afterward, on April 12, 1861, the Civil War began when Confederate forces attacked the U.S. Army's Fort Sumter in Charleston, South Carolina. Four additional slave states then joined the Confederacy after Lincoln, on April 15, called forth in response "the militia of the several States of the Union, to the aggregate number of seventy-five thousand, in order to suppress" the rebellion.
During the war some jurisdictions abolished slavery and, due to Union measures such as the Confiscation Acts and the Emancipation Proclamation, the war effectively ended slavery in most places.
After the Union victory, the Thirteenth Amendment to the United States Constitution was ratified on December 6, 1865, prohibiting "slavery [and] involuntary servitude, except as a punishment for crime."
Background:
Main articles:
- Slavery in the colonial history of the United States,
- Slavery among Native Americans in the United States,
- History of unfree labor in the United States
- Atlantic slave trade;
- Slavery in New France;
- Timeline of events leading to the American Civil War § Colonial period, 1607–1775
- Indentured servitude in British America,
- Indentured servitude in Pennsylvania,
- Indentured servitude in Virginia,
- Engagé system in Louisiana.
During most of the British colonial period, slavery existed in all the colonies. People enslaved in the North typically worked as house servants, artisans, laborers and craftsmen, with the greater number in cities. Many men worked on the docks and in shipping. In 1703, more than 42 percent of New York City households held enslaved people in bondage, the second-highest proportion of any city in the colonies, behind only Charleston, South Carolina.
Enslaved people were also used as agricultural workers in farm communities, especially in the South, but also in upstate New York and Long Island, Connecticut, and New Jersey.
By 1770, there were 397,924 blacks out of a population of 2.17 million in what would soon become the United States. The slaves of the colonial era were unevenly distributed:
- 14,867 lived in New England, where they were three percent of the population;
- 34,679 lived in the mid-Atlantic colonies, where they were six percent of the population;
- and 347,378 in the five Southern Colonies, where they were 31 percent of the population.
The South developed an agricultural economy dependent on commodity crops. Its planters rapidly acquired a significantly higher number and proportion of enslaved people in the population overall, as its commodity crops were labor-intensive.
Early on, enslaved people in the South worked primarily on farms and plantations growing indigo, rice and tobacco (cotton did not become a major crop until after the 1790s).
In 1720, about 65 percent of South Carolina's population was enslaved. Planters (defined by historians in the Upper South as those who held 20 or more slaves) used enslaved workers to cultivate commodity crops. They also worked in the artisanal trades on large plantations and in many Southern port cities. The later wave of settlers in the 18th century who settled along the Appalachian Mountains and backcountry were backwoods subsistence farmers, and they seldom held enslaved people.
Beginning in the second half of the 18th century, a debate emerged over the continued importation of African slaves to the American colonies. Many in the colonial elite, including the Southern slavocracy, opposed the further importation of slaves due to fears that it would destabilize slavery and lead to further slave rebellions.
In 1772, prominent Virginians submitted a petition to the Crown, requesting that the slave trade to Virginia be abolished; it was rejected. Rhode Island forbade the importation of slaves in 1774. All of the colonies except Georgia had banned or limited any such importations by 1786; Georgia did so in 1798. Some of these laws were later repealed.
Slavery in the American Revolution and early republic:
Further information:
- Timeline of events leading to the American Civil War § American Revolution and Confederation period, 1776–1787;
- Timeline of events leading to the American Civil War § Early Constitutional period, 1787–1811
As historian Christopher L. Brown put it, slavery "had never been on the agenda in a serious way before", but the American Revolution "forced it to be a public question from there forward".
After the new country's independence was secure, slavery was a topic of contention at the 1787 Constitutional Convention. Many of the Founding Fathers of the United States were plantation owners who owned large numbers of enslaved laborers; the original Constitution preserved their right to own slaves, and they further gained a political advantage in owning slaves.
Although the enslaved of the early Republic were considered sentient property, were not permitted to vote, and had no rights to speak of, they were to be enumerated in population censuses and counted as three-fifths of a person for the purposes of representation in the national legislature, the U.S. Congress.
Slaves and free blacks who supported the Continental Army:
Main article: Black Patriots
The rebels began to offer freedom as an incentive to motivate slaves to fight on their side. Washington authorized the freeing of slaves who fought with the American Continental Army. Rhode Island started enlisting slaves in 1778 and promised compensation to owners whose slaves enlisted and survived to gain freedom.
During the course of the war, about one-fifth of the Northern army was black. In 1781, Baron Closen, a German officer in the French Royal Deux-Ponts Regiment at the Battle of Yorktown, estimated the American army to be about one-quarter black. These men included both former slaves and free-born blacks. Thousands of free blacks in the Northern states fought in the state militias and Continental Army. In the South, both sides offered freedom to slaves who would perform military service. Roughly 20,000 slaves fought in the American Revolution.
Black Loyalists:
Main articles: Black Loyalist and Dunmore's Proclamation
See also: Book of Negroes
After the Revolutionary War broke out, the British realized they lacked the manpower necessary to prosecute the war. In response, British commanders began issuing proclamations to Patriot-owned slaves, offering freedom if they fled to British lines and assisted the British war effort. Such proclamations, repeatedly issued over the course of the conflict, resulted in up to 100,000 American slaves fleeing to British lines.
Self-emancipated slaves who reached British lines were organized into a variety of military units that served in all theaters of the war. Formerly enslaved women and children worked instead as laborers and domestic servants in lieu of military service. At the end of the war, freed slaves within British lines either evacuated to other British colonies or to Britain itself, were re-enslaved by the victorious Americans, or fled into the countryside.
In early 1775, the royal governor of Virginia, Lord Dunmore, wrote to the Earl of Dartmouth of his intention to free slaves owned by American Patriots in case they staged a rebellion.
On November 7, 1775, Dunmore issued Dunmore's Proclamation, which promised freedom to any slaves of American Patriots who would leave their masters and join the British forces. Historians agree that the proclamation was chiefly designed for practical rather than moral reasons; slaves owned by American Loyalists were unaffected by it.
About 1,500 slaves owned by patriots escaped and joined Dunmore's forces. A total of 18 slaves fled George Washington's plantation, one of whom, Harry, served in Dunmore's all-black loyalist regiment called "the Black Pioneers." Escapees who joined Dunmore had "Liberty to Slaves" stitched on to their jackets. Most died of disease before they could do any fighting, but three hundred of these freed slaves made it to freedom in Britain.
Historian Jill Lepore writes that "between eighty and a hundred thousand (nearly one in five black slaves) left their homes ... betting on British victory", but Cassandra Pybus states that between 20,000 and 30,000 is a more realistic number of slaves who defected to the British side during the war.
Many slaves took advantage of the disruption of war to escape from their plantations to British lines or to fade into the general population. Upon their first sight of British vessels, thousands of slaves in Maryland and Virginia fled from their owners.
Throughout the South, losses of slaves were high, with many due to escapes. Slaves also escaped throughout New England and the mid-Atlantic, with many joining the British who had occupied New York.
In the closing months of the war, the British evacuated freedmen and also removed slaves owned by loyalists. Around 15,000 black loyalists left with the British, most of them ending up as free people in England or its colonies.
Washington hired a slave catcher during the war, and at its end he pressed the British to return the slaves to their masters. With British certificates of freedom in their belongings, the black Loyalists, including Washington's former slave Harry, sailed with their white counterparts out of New York harbor to Nova Scotia. More than 3,000 were resettled there, where they were eventually granted land and formed the community of the Black Nova Scotians.
Early abolitionism in the United States:
Main article: Abolitionism in the United States
In the first two decades after the American Revolution, state legislatures and individuals took actions to free slaves. Northern states passed new constitutions that contained language about equal rights or specifically abolished slavery; some states, such as New York and New Jersey, where slavery was more widespread, passed laws by the end of the 18th century to abolish slavery incrementally.
By 1804, all the Northern states had passed laws outlawing slavery, either immediately or over time. In New York, the last slaves were freed in 1827 (celebrated with a big July 4 parade). Indentured servitude, which had been widespread in the colonies (half the population of Philadelphia had once been indentured servants), dropped dramatically, and disappeared by 1800.
However, there were still forcibly indentured servants in New Jersey as late as 1860. No Southern state abolished slavery, but a number of individual owners, more than a handful, freed their slaves by personal decision, often providing for manumission in wills and sometimes filing deeds or court papers to free individuals.
Numerous slaveholders who freed their slaves cited revolutionary ideals in their documents; others freed slaves as a promised reward for service. As a result of these actions, the proportion of free Blacks in the United States increased from 8 to 13.5 percent between 1790 and 1810, and in the Upper South from less than one to nearly ten percent.
Starting in 1777, the rebels outlawed the importation of slaves state by state. They all acted to end the international trade, but, after the war, it was reopened in South Carolina and Georgia.
In 1807, the United States Congress acted on President Thomas Jefferson's advice and, without controversy, made importing slaves from abroad a federal crime, effective the first day that the United States Constitution permitted this prohibition: January 1, 1808.
During the Revolution and in the following years, all states north of Maryland took steps towards abolishing slavery. In 1777, the Vermont Republic, which was still unrecognized by the United States, passed a state constitution prohibiting slavery.
The Pennsylvania Abolition Society, led in part by Benjamin Franklin, was founded in 1775, and Pennsylvania began gradual abolition in 1780. In 1783, the Supreme Judicial Court of Massachusetts ruled in Commonwealth v. Jennison that slavery was unconstitutional under the state's new 1780 constitution.
New Hampshire began gradual emancipation in 1783, while Connecticut and Rhode Island followed suit in 1784. The New York Manumission Society, which was led by John Jay, Alexander Hamilton and Aaron Burr, was founded in 1785. New York state began gradual emancipation in 1799, and New Jersey did the same in 1804.
Shortly after the Revolution, the Northwest Territory was established by Manasseh Cutler and Rufus Putnam (who had been George Washington's chief engineer). Both Cutler and Putnam came from Puritan New England.
The Puritans strongly believed that slavery was morally wrong. Their influence on the issue of slavery was long-lasting, and the Revolution gave it significantly greater impetus.
The Northwest Territory (which became Ohio, Michigan, Indiana, Illinois, Wisconsin and part of Minnesota) doubled the size of the United States, and at the insistence of Cutler and Putnam it was established as "free soil": no slavery. This was to prove crucial a few decades later. Had those states been slave states, and their electoral votes gone to Abraham Lincoln's main opponent, Lincoln would not have become president. The Civil War would not have been fought. Even if it eventually had been, the North might well have lost.
Constitution of the United States:
Main article: Slavery and the United States Constitution
Further information: Fugitive Slave Clause
Slavery was a contentious issue in the writing and approval of the Constitution of the United States. The words "slave" and "slavery" did not appear in the Constitution as originally adopted, although several provisions clearly referred to slaves and slavery.
Until the adoption of the 13th Amendment in 1865, the Constitution did not prohibit slavery.
Section 9 of Article I forbade the federal government from prohibiting the importation of slaves, described as "such Persons as any of the States now existing shall think proper to admit", for twenty years after the Constitution's ratification (until January 1, 1808).
The Act Prohibiting Importation of Slaves of 1807, passed by Congress and signed into law by President Thomas Jefferson (who had called for its enactment in his 1806 State of the Union address), went into effect on January 1, 1808, the earliest date on which the importation of slaves could be prohibited under the Constitution.
The delegates approved the Fugitive Slave Clause of the Constitution (Article IV, section 2, clause 3), which prohibited states from freeing slaves who fled to them from another state and required that they be returned to their owners. The Fugitive Slave Act of 1793 and the Fugitive Slave Act of 1850 gave effect to the Fugitive Slave Clause. Salmon P. Chase considered the Fugitive Slave Acts unconstitutional because "The Fugitive Slave Clause was a compact among the states, not a grant of power to the federal government".
Three-fifths Compromise:
Main article: Three-fifths Compromise
In a section negotiated by James Madison of Virginia, Section 2 of Article I designated "other persons" (slaves) to be added to the total of the state's free population, at the rate of three-fifths of their total number, to establish the state's official population for the purposes of apportionment of congressional representation and federal taxation.
The "Three-Fifths Compromise" was reached after a debate in which delegates from Southern (slaveholding) states argued that slaves should be counted in the census just as all other persons were while delegates from Northern (free) states countered that slaves should not be counted at all.
The compromise strengthened the political power of Southern states, as three-fifths of the (non-voting) slave population was counted for congressional apportionment and in the Electoral College, although it did not strengthen Southern states as much as it would have had the Constitution provided for counting all persons, whether slave or free, equally.
In addition, many parts of the country were tied to the Southern economy. As the historian James Oliver Horton noted, prominent slaveholder politicians and the commodity crops of the South had a strong influence on United States politics and economy. Horton said,
in the 72 years between the election of George Washington and the election of Abraham Lincoln, 50 of those years [had] a slaveholder as president of the United States, and, for that whole period of time, there was never a person elected to a second term who was not a slaveholder.
The power of Southern states in Congress lasted until the Civil War, affecting national policies, legislation, and appointments.
One result was that most of the justices appointed to the Supreme Court were slave owners. The planter elite dominated the Southern congressional delegations and the United States presidency for nearly fifty years.
Slavery in the 19th Century:
Further information:
- American slave court cases,
- Fugitive slaves in the United States,
- Timeline of events leading to the American Civil War § 1812–1849,
- Female slavery in the United States
- Slave labor on United States military installations 1799–1863
- Slavery at American colleges and universities
Slavery in the United States was a variable thing, in "constant flux, driven by the violent pursuit of ever-larger profits." According to demographic calculations by J. David Hacker of the University of Minnesota, approximately four out of five of all of the slaves who ever lived in the United States or the territory that became the United States (beginning in 1619 and including all colonies that were eventually acquired or conquered by the United States) were born in or imported to the United States in the 19th century.
Slaves were the labor force of the South, but slave ownership was also the foundation upon which American white supremacy was constructed. Historian Walter Johnson argues that "one of the many miraculous things a slave could do was make a household white...", meaning that the value of whiteness in America was in some ways measured by the ability to purchase and maintain black slaves.
Harriet Beecher Stowe described slavery in the United States in 1853:
What, then, is American slavery, as we have seen it exhibited by law, and by the decision of Courts? Let us begin by stating what it is not:
1. It is not apprenticeship.
2. It is not guardianship.
3. It is in no sense a system for the education of a weaker race by a stronger.
4. The happiness of the governed is in no sense its object.
5. The temporal improvement or the eternal well-being of the governed is in no sense its object.
The object of it has been distinctly stated in one sentence by Judge Ruffin,— "The end is the profit of the master, his security, and the public safety." Slavery, then, is absolute despotism, of the most unmitigated form.
Justifications in the South:
Further information:
- Field slaves in the United States,
- Gang system,
- Task system,
- Plantation complexes in the Southern United States,
- American gentry,
- Planter class,
- List of plantations in the United States
American slavery as "a necessary evil":
In the 19th century, proponents of slavery often defended the institution as a "necessary evil". At that time, it was feared that the emancipation of black slaves would have more harmful social and economic consequences than the continuation of slavery.
On April 22, 1820, Thomas Jefferson, one of the Founding Fathers of the United States, wrote in a letter to John Holmes, that with slavery, "We have the wolf by the ear, and we can neither hold him, nor safely let him go. Justice is in one scale, and self-preservation in the other."
The French writer and traveler Alexis de Tocqueville, in his influential Democracy in America (1835), expressed opposition to slavery while observing its effects on American society:
- He felt that a multiracial society without slavery was untenable, as he believed that prejudice against blacks increased as they were granted more rights (for example, in Northern states).
- He believed that the attitudes of white Southerners, and the concentration of the black population in the South, were bringing the white and black populations to a state of equilibrium, and were a danger to both races.
- Because of the racial differences between master and slave, he believed that the latter could not be emancipated.
In a letter to his wife dated December 27, 1856, in reaction to a message from President Franklin Pierce, Robert E. Lee wrote:
- "There are few, I believe, in this enlightened age, who will not acknowledge that slavery as an institution is a moral and political evil.
- It is idle to expatiate on its disadvantages. I think it is a greater evil to the white than to the colored race.
- While my feelings are strongly enlisted in behalf of the latter, my sympathies are more deeply engaged for the former.
- The blacks are immeasurably better off here than in Africa, morally, physically, and socially.
- The painful discipline they are undergoing is necessary for their further instruction as a race, and will prepare them, I hope, for better things. How long their servitude may be necessary is known and ordered by a merciful Providence."
American slavery as "a positive good":
Main article: Slavery as a positive good in the United States
See also: Mudsill theory
However, as the abolitionist movement's agitation increased and the area developed for plantations expanded, apologetic defenses of slavery faded in the South; leaders instead described slavery as a beneficial scheme of labor management.
John C. Calhoun, in a famous speech in the Senate in 1837, declared that slavery was "instead of an evil, a good – a positive good". Calhoun supported his view with the following reasoning:
- in every civilized society one portion of the community must live on the labor of another;
- learning, science, and the arts are built upon leisure;
- the African slave, kindly treated by his master and mistress and looked after in his old age, is better off than the free laborers of Europe;
- and under the slave system conflicts between capital and labor are avoided.
- The advantages of slavery in this respect, he concluded, "will become more and more manifest, if left undisturbed by interference from without, as the country advances in wealth and numbers".
South Carolina army officer, planter and railroad executive James Gadsden called slavery "a social blessing" and abolitionists "the greatest curse of the nation". Gadsden was in favor of South Carolina's secession in 1850, and was a leader in efforts to split California into two states, one slave and one free.
Other Southern writers who also began to portray slavery as a positive good were James Henry Hammond and George Fitzhugh. They presented several arguments to defend the practice of slavery in the South.
Hammond, like Calhoun, believed that slavery was needed to build the rest of society. In a speech to the Senate on March 4, 1858, Hammond developed his "Mudsill Theory," defending his view on slavery by stating:
- "Such a class you must have, or you would not have that other class which leads progress, civilization, and refinement.
- It constitutes the very mud-sill of society and of political government;
- and you might as well attempt to build a house in the air, as to build either the one or the other, except on this mud-sill."
Hammond believed that in every class one group must accomplish all the menial duties, because without them the leaders in society could not progress. He argued that the hired laborers of the North were slaves too: "The difference ... is, that our slaves are hired for life and well compensated; there is no starvation, no begging, no want of employment," while those in the North had to search for employment.
George Fitzhugh used assumptions about white superiority to justify slavery, writing that, "the Negro is but a grown up child, and must be governed as a child."
In The Universal Law of Slavery, Fitzhugh argues that slavery provides everything necessary for life and that the slave is unable to survive in a free world because he is lazy, and cannot compete with the intelligent European white race.
He states that "The negro slaves of the South are the happiest, and in some sense, the freest people in the world." Without the South, "He (slave) would become an insufferable burden to society" and "Society has the right to prevent this, and can only do so by subjecting him to domestic slavery."
On March 21, 1861, Alexander Stephens, Vice President of the Confederacy, delivered his Cornerstone Speech. He explained the differences between the Constitution of the Confederate States and the United States Constitution, laid out the cause for the American Civil War, as he saw it, and defended slavery:
- The new [Confederate] Constitution has put at rest forever all the agitating questions relating to our peculiar institutions – African slavery as it exists among us – the proper status of the negro in our form of civilization.
- This was the immediate cause of the late rupture and present revolution. Jefferson, in his forecast, had anticipated this, as the "rock upon which the old Union would split." He was right.
- What was conjecture with him, is now a realized fact. But whether he fully comprehended the great truth upon which that rock stood and stands, may be doubted.
- The prevailing ideas entertained by him and most of the leading statesmen at the time of the formation of the old Constitution were, that the enslavement of the African was in violation of the laws of nature; that it was wrong in principle, socially, morally and politically.
- It was an evil they knew not well how to deal with; but the general opinion of the men of that day was, that, somehow or other, in the order of Providence, the institution would be evanescent and pass away ... Those ideas, however, were fundamentally wrong.
- They rested upon the assumption of the equality of races. This was an error. It was a sandy foundation, and the idea of a Government built upon it – when the "storm came and the wind blew, it fell".
- Our new Government is founded upon exactly the opposite ideas; its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery, subordination to the superior race, is his natural and moral condition.
This view of the Negro "race" was backed by pseudoscience. The leading researcher was Dr. Samuel A. Cartwright, inventor of the mental illnesses of drapetomania (the desire of a slave to run away) and dysaesthesia aethiopica ("rascality"), both cured by whipping. The Medical Association of Louisiana set up a committee, of which he was chair, to investigate "the Diseases and Physical Peculiarities of the Negro Race". Their report, first delivered to the Medical Association in an address, was published in their journal, and then reprinted in part in the widely circulated DeBow's Review.
Further information: Eugenics in the United States
Proposed expansion of slavery:
Whether slavery would be limited to the Southern states that already had it, or permitted in new states formed from the lands of the Louisiana Purchase and Mexican Cession, was a major issue in the 1840s and 1850s. Results included the Compromise of 1850 and the Bleeding Kansas period.
Also relatively well known are the proposals, including the Ostend Manifesto, to annex Cuba as a slave state. There was also talk of making slave states of Mexico, Nicaragua (see Walker affair) and other lands around the so-called Golden Circle. Less well-known today, though well known at the time, is that pro-slavery Southerners:
- Actively sought to reopen the transatlantic slave trade
- Funded illegal slave shipments from Africa such as the Wanderer slave shipment to Georgia in 1858.
- Wanted to reintroduce slavery in the Northern states, through federal action or Constitutional amendment making slavery legal nationwide, thus overriding state anti-slavery laws. (See Crittenden Compromise.) This was described as "well underway" by 1858.
- Said openly that slavery should by no means be limited to Negroes, since in their view it was beneficial. Northern white workers, allegedly "wage slaves" already, would supposedly have better lives if they were enslaved.
None of these ideas got very far, but they alarmed Northerners and contributed to the growing polarization of the country.
Abolitionism in the North:
Main article: Abolitionism in the United States
Further information:
- List of abolitionists,
- Underground Railroad,
- African American founding fathers of the United States,
- and Radical Republicans
William Ellsworth, attorney for Prudence Crandall, 1834: "Slavery is a volcano, the fires of which cannot be quenched, nor its ravages controlled. We already feel its convulsions, and if we sit idly gazing upon its flames, as they rise higher and higher, our happy republic will be buried in ruin, beneath its overwhelming energies."
Beginning during the Revolution and in the first two decades of the postwar era, every state in the North abolished slavery. These were the first abolitionist laws in the Atlantic World. However, the abolition of slavery did not necessarily mean that existing slaves became free. In some states they were forced to remain with their former owners as indentured servants: free in name only, although they could not be sold and thus families could not be split, and their children were born free.
The end of slavery did not come in New York until July 4, 1827, when it was celebrated with a big parade. However, in the 1830 census, the only state with no slaves was Vermont. In the 1840 census, there were still slaves in New Hampshire (1), Rhode Island (5), Connecticut (17), New York (4), Pennsylvania (64), Ohio (3), Indiana (3), Illinois (331), Iowa (16), and Wisconsin (11). There were none in these states in the 1850 census.
Most Northern states passed legislation for gradual abolition, first freeing children born to slave mothers (and requiring them to serve lengthy indentures to their mother's owners, often into their 20s as young adults).
In 1845, the Supreme Court of New Jersey heard lengthy arguments for "the deliverance of four thousand persons from bondage". Pennsylvania's last slaves were freed in 1847 and Connecticut's in 1848. Neither New Hampshire nor New Jersey recorded any slaves in the 1850 Census, and in the 1860 Census New Jersey recorded only one and New Hampshire none; even so, slavery was not prohibited in either state until ratification of the 13th Amendment in 1865 (which New Jersey was among the last states to ratify).
None of the Southern states abolished slavery before 1865, but it was not unusual for individual slaveholders in the South to free numerous slaves, often citing revolutionary ideals, in their wills.
Methodist, Quaker, and Baptist preachers traveled in the South, appealing to slaveholders to manumit their slaves, and there were "manumission societies" in some Southern states.
By 1810, the number and proportion of free blacks in the population of the United States had risen dramatically. Most free blacks lived in the North, but even in the Upper South, the proportion of free blacks went from less than one percent of all blacks to more than ten percent, even as the total number of slaves was increasing through imports.
One of the early Puritan writings on this subject was "The Selling of Joseph," by Samuel Sewall in 1700. In it, Sewall condemned slavery and the slave trade and refuted many of the era's typical justifications for slavery.
The Puritan influence on slavery was still strong at the time of the American Revolution and up until the Civil War. Of America's first seven presidents, the two who did not own slaves, John Adams and John Quincy Adams, came from Puritan New England. They were wealthy enough to own slaves, but they chose not to because they believed that it was morally wrong to do so.
In 1765, colonial leader Samuel Adams and his wife were given a slave girl as a gift. They immediately freed her. Just after the Revolution, in 1787, the Northwest Territory (which became the states of Ohio, Michigan, Indiana, Illinois, Wisconsin and part of Minnesota) was opened up for settlement. The two men responsible for establishing this territory were Manasseh Cutler and Rufus Putnam. They came from Puritan New England, and they insisted that this new territory, which doubled the size of the United States, was going to be "free soil" – no slavery. This was to prove crucial in the coming decades. If those states had become slave states, and their electoral votes had gone to Abraham Lincoln's main opponent, Lincoln would not have been elected president.
In the decades leading up to the Civil War, abolitionists repeatedly used the country's Puritan heritage to bolster their cause. The most radical anti-slavery newspaper, The Liberator, invoked the Puritans and Puritan values over a thousand times. Theodore Parker, in urging New England Congressmen to support the abolition of slavery, wrote that "The son of the Puritan ... is sent to Congress to stand up for Truth and Right ..."
Northerners predominated in the westward movement into the Midwestern territory after the American Revolution; as the states were organized, they voted to prohibit slavery in their constitutions when they achieved statehood: Ohio in 1803, Indiana in 1816 and Illinois in 1818.
What developed was a Northern bloc of free states united into one contiguous geographic area that generally shared an anti-slavery culture. The exceptions were the areas along the Ohio River settled by Southerners: the southern portions of Indiana, Ohio and Illinois.
Residents of those areas generally shared in Southern culture and attitudes. In addition, these areas were devoted to agriculture longer than the industrializing northern parts of these states, and some farmers used slave labor.
In Illinois, for example, while the trade in slaves was prohibited, it was legal to bring slaves from Kentucky into Illinois and use them there, as long as the slaves left Illinois one day per year (they were "visiting"). The emancipation of slaves in the North led to the growth in the population of Northern free blacks, from several hundred in the 1770s to nearly 50,000 by 1810.
Throughout the first half of the 19th century, abolitionism, a movement to end slavery, grew in strength; most abolitionist societies and supporters were in the North. They worked to raise awareness about the evils of slavery, and to build support for abolition.
After 1830, abolitionist and newspaper publisher William Lloyd Garrison promoted emancipation, characterizing slaveholding as a personal sin. He demanded that slaveowners repent and start the process of emancipation. His position increased defensiveness on the part of some Southerners, who noted the long history of slavery among many cultures. A few abolitionists, such as John Brown, favored the use of armed force to foment uprisings among the slaves, as he attempted to do at Harper's Ferry.
Most abolitionists tried to raise public support to change laws and to challenge slave laws. Abolitionists were active on the lecture circuit in the North, and often featured escaped slaves in their presentations. Writer and orator Frederick Douglass became an important abolitionist leader after escaping from slavery. Harriet Beecher Stowe's novel Uncle Tom's Cabin (1852) was an international bestseller, and along with the non-fiction companion A Key to Uncle Tom's Cabin, aroused popular sentiment against slavery. It also provoked the publication of numerous anti-Tom novels by Southerners in the years before the American Civil War.
This struggle took place amid strong support for slavery among white Southerners, who profited greatly from the system of enslaved labor. But slavery was entwined with the national economy; for instance, the banking, shipping, insurance, and manufacturing industries of New York City all had strong economic interests in slavery, as did similar industries in other major port cities in the North.
The Northern textile mills in New York and New England processed Southern cotton and manufactured clothes to outfit slaves. By 1822, half of New York City's exports were related to cotton.
Slaveholders began to refer to slavery as the "peculiar institution" to differentiate it from other examples of forced labor. They justified it as less cruel than the free labor of the North.
The principal organized bodies to advocate abolition and anti-slavery reforms in the north were the Pennsylvania Abolition Society and the New York Manumission Society. Before the 1830s the antislavery groups called for gradual emancipation. By the late 1820s, under the impulse of religious evangelicals such as Beriah Green, the sense emerged that owning slaves was a sin and the owner had to immediately free himself from this grave sin by immediate emancipation.
Prohibiting the international trade
Main article: Act Prohibiting Importation of Slaves
Under the Constitution, Congress could not prohibit the importation of slaves before 1808; by 1807, South Carolina was the only state that still allowed it. However, the Third Congress regulated against the trade in the Slave Trade Act of 1794, which prohibited the building and outfitting of American ships for the trade.
Subsequent acts in 1800 and 1803 sought to discourage the trade by banning American investment in the trade, and American employment on ships in the trade, as well as prohibiting importation into states that had abolished slavery, which all states except South Carolina had by 1807.
The final Act Prohibiting Importation of Slaves was adopted in 1807 and went into effect in 1808. However, illegal importation of African slaves (smuggling) was common. The Cuban slave trade between 1796 and 1807 was dominated by American slave ships. Despite the 1794 Act, Rhode Island slave ship owners found ways to continue supplying the slave-owning states. The overall U.S. slave-ship fleet in 1806 was estimated to be almost 75% the size of that of the British.
After Great Britain and the United States outlawed the international slave trade in 1807, British slave trade suppression activities began in 1808 through diplomatic efforts and the formation of the Royal Navy's West Africa Squadron in 1809.
The United States denied the Royal Navy the right to stop and search U.S. ships suspected as slave ships, so not only were American ships unhindered by British patrols, but slavers from other countries would fly the American flag to try to avoid being stopped. Co-operation between the United States and Britain was not possible during the War of 1812 or the period of poor relations in the following years.
In 1820, the United States Navy sent USS Cyane, under the command of Captain Edward Trenchard, to patrol the slave coasts of West Africa. Cyane seized four American slave ships in her first year on station, and Trenchard developed a good level of co-operation with the Royal Navy. Four additional U.S. warships were sent to the African coast in 1820 and 1821, and a total of 11 American slave ships were taken by the U.S. Navy over this period. American enforcement activity then declined.
There was still no agreement between the United States and Britain on a mutual right to board suspected slave traders sailing under each other's flag; attempts to reach one stalled in the United States Senate in 1821 and 1824. The U.S. Navy's presence, however sporadic, did push American slavers to sail under the Spanish flag, but the trade remained extensive.
The Webster-Ashburton Treaty of 1842 set a guaranteed minimum level of patrol activity by the U.S. Navy and the Royal Navy, and formalized the level of co-operation that had existed in 1820. Its effects, however, were minimal, and opportunities for greater co-operation went untaken.
The U.S. transatlantic slave trade was not effectively suppressed until 1861, during Lincoln's presidency, when a treaty with Britain was signed whose provisions included allowing the Royal Navy to board, search and arrest slavers operating under the American flag.
War of 1812:
See also: Black refugee (War of 1812)
During the War of 1812, British Royal Navy commanders of the blockading fleet were instructed to offer freedom to defecting American slaves, as the Crown had during the Revolutionary War. Thousands of escaped slaves went over to the Crown with their families.
Men were recruited into the Corps of Colonial Marines on occupied Tangier Island, in the Chesapeake Bay. Many freed American slaves were recruited directly into existing West Indian regiments, or newly created British Army units. The British later resettled a few thousand freed slaves to Nova Scotia. Their descendants, together with descendants of the black people resettled there after the Revolution, have established the Black Loyalist Heritage Museum.
Slaveholders, primarily in the South, had considerable "loss of property" as thousands of slaves escaped to the British lines or ships for freedom, despite the difficulties. The planters' complacency about slave "contentment" was shocked by seeing that slaves would risk so much to be free. Afterward, when some freed slaves had been settled at Bermuda, slaveholders such as Major Pierce Butler of South Carolina tried to persuade them to return to the United States, to no avail.
The Americans protested that Britain's failure to return all slaves violated the Treaty of Ghent. After arbitration by the Tsar of Russia, the British paid $1,204,960 in damages (about $31.2 million in today's money) to Washington, which reimbursed the slaveowners.
Slave rebellions:
According to Herbert Aptheker, "there were few phases of ante-bellum Southern life and history that were not in some way influenced by the fear of, or the actual outbreak of, militant concerted slave action."
Historians in the 20th century identified 250 to 311 slave uprisings in U.S. and colonial history. Those after 1776 include:
- Gabriel's conspiracy (1800)
- Igbo Landing slave escape and mass suicide (1803)
- Chatham Manor Rebellion (1805)
- 1811 German Coast uprising
- George Boxley Rebellion (1815)
- Denmark Vesey's conspiracy (1822)
- Nat Turner's slave rebellion (1831)
- Black Seminole Slave Rebellion (1835–1838)
- Amistad seizure (1839)
- Creole case (1841)
- 1842 Slave Revolt in the Cherokee Nation
In 1831, Nat Turner, a literate slave who claimed to have spiritual visions, organized a slave rebellion in Southampton County, Virginia; it was sometimes called the Southampton Insurrection. Turner and his followers killed nearly sixty white inhabitants, mostly women and children.
Many of the men in the area were attending a religious event in North Carolina. Eventually Turner was captured with 17 other rebels, who were subdued by the militia. Turner and his followers were hanged, and Turner's body was flayed.
In a frenzy of fear and retaliation, the militia killed more than 100 slaves who had not been involved in the rebellion. Planters whipped hundreds of innocent slaves to ensure resistance was quelled.
This rebellion prompted Virginia and other slave states to pass more restrictions on slaves and free people of color, controlling their movement and requiring more white supervision of gatherings. In 1835, North Carolina withdrew the franchise from free people of color.
Post-revolution Southern manumissions:
Although Virginia, Maryland and Delaware were slave states, the latter two already had a high proportion of free blacks by the outbreak of war. Following the Revolution, the three legislatures made manumission easier, allowed by deed or will. Quaker and Methodist ministers particularly urged slaveholders to free their slaves.
The number and proportion of freed slaves in these states rose dramatically until 1810. More than half of the number of free blacks in the United States were concentrated in the Upper South. The proportion of free blacks among the black population in the Upper South rose from less than 1% in 1792 to more than 10% by 1810. In Delaware, nearly 75% of blacks were free by 1810.
In the United States as a whole, the number of free blacks reached 186,446, or 13.5% of all blacks, by 1810. After that period, few slaves were freed, as the development of short-staple cotton plantations in the Deep South drove up demand for slaves in the domestic slave trade and the prices paid for them.
South Carolina made manumission more difficult, requiring legislative approval of every instance of manumission. Alabama banned free black people from the state beginning in 1834; free people of color who crossed the state line were subject to enslavement.
Free black people in Arkansas after 1843 had to buy a $500 good-behavior bond, and no unenslaved black person was legally allowed to move into the state.
Female slave owners:
Women exercised their right to own and control human property without their husbands' interference or permission, and they were active participants in the slave trade. In South Carolina, for example, 40% of bills of sale for slaves from the 1700s onward included a female buyer or seller. Women also governed their slaves much as men did, engaging in the same levels of physical discipline. Like men, they brought lawsuits against those who jeopardized their ownership of their slaves.
Black slave owners:
Main article: Black slave owners in the United States
Despite the longstanding color line in the United States, some African Americans were slave owners themselves, some in cities and others as plantation owners in the country. Slave ownership signified both wealth and increased social status. Black slave owners were uncommon, however, as "of the two and a half million African Americans living in the United States in 1850, the vast majority [were] enslaved."
Native American slave owners:
Main article: Native American slave ownership
After 1800, some of the Cherokee and the other four of the Five Civilized Tribes of the Southeast started buying and using black slaves as labor. They continued this practice after removal to Indian Territory in the 1830s, when as many as 15,000 enslaved blacks were taken with them.
The nature of slavery in Cherokee society often mirrored that of white slave-owning society. The law barred intermarriage of Cherokees and enslaved African Americans, but Cherokee men had unions with enslaved women, resulting in mixed-race children. Cherokee who aided slaves were punished with one hundred lashes on the back. In Cherokee society, persons of African descent were barred from holding office even if they were also racially and culturally Cherokee. They were also barred from bearing arms and owning property. The Cherokee prohibited the teaching of African Americans to read and write.
By contrast, the Seminole welcomed into their nation African Americans who had escaped slavery (Black Seminoles). Historically, the Black Seminoles lived mostly in distinct bands near the Native American Seminole. Some were held as slaves of particular Seminole leaders.
Seminole practice in Florida had acknowledged slavery, though not the chattel slavery model common elsewhere. It was, in fact, more like feudal dependency and taxation.
The relationship between Seminole blacks and natives changed following their relocation in the 1830s to territory controlled by the Creek, who had a system of chattel slavery. Pro-slavery pressure from the Creek and pro-Creek Seminole, along with slave raiding, led many Black Seminoles to escape to Mexico.
High demand and smuggling:
The United States Constitution, adopted in 1787, prevented Congress from completely banning the importation of slaves until 1808, although Congress regulated against the trade in the Slave Trade Act of 1794, and in subsequent Acts in 1800 and 1803. During and after the Revolution, the states individually passed laws against importing slaves.
By contrast, the states of Georgia and South Carolina reopened their trade due to demand by their upland planters, who were developing new cotton plantations: Georgia from 1800 until December 31, 1807, and South Carolina from 1804. In that period, Charleston traders imported about 75,000 slaves, more than were brought to South Carolina in the 75 years before the Revolution. Approximately 30,000 were imported to Georgia.
By January 1, 1808, when Congress banned further imports, South Carolina was the only state that still allowed importation of enslaved people. The domestic trade became extremely profitable as demand rose with the expansion of cultivation in the Deep South for cotton and sugar cane crops.
Slavery in the United States became, more or less, self-sustaining by natural increase among the current slaves and their descendants. Maryland and Virginia viewed themselves as slave producers, seeing "producing slaves" as resembling animal husbandry. Workers, including many children, were relocated by force from the upper to the lower South.
Despite the ban, slave imports continued through smugglers bringing in slaves past the U.S. Navy's African Slave Trade Patrol to South Carolina, and overland from Texas and Florida, both under Spanish control. Congress increased the punishment associated with importing slaves, classifying it in 1820 as an act of piracy, with smugglers subject to harsh penalties, including death if caught.
After that, "it is unlikely that more than 10,000 [slaves] were successfully landed in the United States." But, some smuggling of slaves into the United States continued until just before the start of the Civil War.
Further information:
- Echo (1845 ship),
- Wanderer (slave ship),
- Clotilda (slave ship),
- William Walker (filibuster),
- and Movement to reopen the transatlantic slave trade
Colonization movement:
In the early part of the 19th century, other organizations were founded to take action on the future of black Americans. Some advocated removing free black people from the United States to places where they would enjoy greater freedom; some endorsed colonization in Africa, while others advocated emigration, usually to Haiti.
During the 1820s and 1830s, the American Colonization Society (ACS) was the primary organization to implement the "return" of black Americans to Africa. The ACS was made up mostly of Quakers and slaveholders, and they found uneasy common ground in support of what was incorrectly called "repatriation".
By this time, however, most black Americans were native-born and did not want to emigrate, saying they were no more African than white Americans were British. Rather, they wanted full rights in the United States, where their families had lived and worked for generations.
In 1822, the ACS and affiliated state societies established what would become the colony of Liberia, in West Africa. The ACS assisted thousands of freedmen and free blacks (with legislated limits) to emigrate there from the United States.
Many white people considered this preferable to emancipation in the United States. Henry Clay, one of the founders and a prominent slaveholder politician from Kentucky, said that blacks faced:
- ...unconquerable prejudice resulting from their color, they never could amalgamate with the free whites of this country. It was desirable, therefore, as it respected them, and the residue of the population of the country, to drain them off.
Deportation would also be a way to prevent reprisals against former slaveholders and white people in general, as had occurred in the 1804 Haiti massacre, which had contributed to a consuming fear amongst whites of retributive black violence, a phobia dubbed Haitianism.
Domestic slave trade and forced migration:
Main article: Slave trade in the United States
Further information:
- List of American slave traders,
- Slave markets and slave jails in the United States,
- Kidnapping into slavery in the United States
The U.S. Constitution barred the federal government from prohibiting the importation of slaves for twenty years. Various states passed bans on the international slave trade during that period; by 1808, the only state still allowing the importation of African slaves was South Carolina.
After 1808, legal importation of slaves ceased, although there was smuggling via Spanish Florida and the disputed Gulf Coast to the west. This route all but ended after Florida became a U.S. territory in 1821 (but see slave ships Wanderer and Clotilda).
The replacement for the importation of slaves from abroad was increased domestic production. Virginia and Maryland had little new agricultural development, and their need for slaves was mostly to replace those who died. Normal reproduction more than supplied these: Virginia and Maryland had surpluses of slaves. Their tobacco farms were "worn out" and the climate was not suitable for cotton or sugar cane.
The surplus was even greater because slaves were encouraged to reproduce (though they could not marry). The pro-slavery Virginian Thomas Roderick Dew wrote in 1832 that Virginia was a "negro-raising state"; i.e. Virginia "produced" slaves. According to him, in 1832 Virginia exported "upwards of 6,000 slaves" per year, "a source of wealth to Virginia".
A newspaper from 1836 gives the figure as 40,000, earning for Virginia an estimated $24,000,000 per year. Demand for slaves was the strongest in what was then the southwest of the country: Alabama, Mississippi, and Louisiana, and, later, Texas, Arkansas, and Missouri.
Here there was abundant land suitable for plantation agriculture, which young men with some capital established. This was expansion of the white, monied population: younger men seeking their fortune.
The most valuable crop that could be grown on a plantation in that climate was cotton. That crop was labor-intensive, and the least-costly laborers were slaves. Demand for slaves exceeded the supply in the southwest; therefore slaves, never cheap if they were productive, went for a higher price.
As portrayed in Uncle Tom's Cabin (the "original" cabin was in Maryland), "selling South" was greatly feared. A recently (2018) publicized example of the practice of "selling South" is the 1838 sale by Jesuits of 272 slaves from Maryland, to plantations in Louisiana, to benefit Georgetown University, which has been described as "ow[ing] its existence" to this transaction.
The growing international demand for cotton led many plantation owners further west in search of suitable land. In addition, the invention of the cotton gin in 1793 enabled profitable processing of short-staple cotton, which could readily be grown in the uplands. The invention revolutionized the cotton industry by increasing fifty-fold the quantity of cotton that could be processed in a day.
At the end of the War of 1812, fewer than 300,000 bales of cotton were produced nationally. By 1820, the amount of cotton produced had increased to 600,000 bales, and by 1850 it had reached 4,000,000. There was an explosive growth of cotton cultivation throughout the Deep South and greatly increased demand for slave labor to support it. As a result, manumissions decreased dramatically in the South.
Most of the slaves sold from the Upper South were from Maryland, Virginia and the Carolinas, where changes in agriculture decreased the need for their labor and the demand for slaves. Before 1810, primary destinations for the slaves who were sold were Kentucky and Tennessee, but, after 1810, the Deep South states of Georgia, Alabama, Mississippi, Louisiana and Texas received the most slaves. This is where cotton became "king". Meanwhile, the Upper South states of Kentucky and Tennessee joined the slave-exporting states.
By 1815, the domestic slave trade had become a major economic activity in the United States; it lasted until the 1860s. Between 1830 and 1840, nearly 250,000 slaves were taken across state lines. In the 1850s, more than 193,000 enslaved persons were transported, and historians estimate nearly one million in total took part in the forced migration of this new "Middle Passage."
By 1860, the slave population in the United States had reached four million. Of the 1,515,605 free families in the fifteen slave states in 1860, nearly 400,000 held slaves (roughly one in four, or 25%), amounting to 8% of all American families.
The historian Ira Berlin called this forced migration of slaves the "Second Middle Passage" because it reproduced many of the same horrors as the Middle Passage (the name given to the transportation of slaves from Africa to North America). These sales of slaves broke up many families and caused much hardship.
Characterizing it as the "central event" in the life of a slave between the American Revolution and the Civil War, Berlin wrote that, whether slaves were directly uprooted or lived in fear that they or their families would be involuntarily moved, "the massive deportation traumatized black people, both slave and free". Individuals lost their connection to families and clans.
Added to the earlier colonists combining slaves from different tribes, many ethnic Africans lost their knowledge of varying tribal origins in Africa. Most were descended from families that had been in the United States for many generations.
The firm of Franklin and Armfield was a leader in this trade. In the 1840s, almost 300,000 slaves were transported, with Alabama and Mississippi receiving 100,000 each. During each decade between 1810 and 1860, at least 100,000 slaves were moved from their state of origin.
In the final decade before the Civil War, 250,000 were transported. Michael Tadman wrote in Speculators and Slaves: Masters, Traders, and Slaves in the Old South (1989) that 60–70% of inter-regional migrations were the result of the sale of slaves. In 1820, a slave child in the Upper South had a 30 percent chance of being sold South by 1860. The death rate for the slaves on their way to their new destination across the American South was less than that suffered by captives shipped across the Atlantic Ocean, but mortality nevertheless was higher than the normal death rate.
Slave traders transported two-thirds of the slaves who moved West. Only a minority moved with their families and existing master. Slave traders had little interest in purchasing or transporting intact slave families; in the early years, planters demanded only the young male slaves needed for heavy labor.
Later, in the interest of creating a "self-reproducing labor force", planters purchased nearly equal numbers of men and women. Berlin wrote: "The internal slave trade became the largest enterprise in the South outside the plantation itself, and probably the most advanced in its employment of modern transportation, finance, and publicity."
The slave trade industry developed its own unique language, with terms such as "prime hands", "bucks", "breeding wenches", and "fancy girls" coming into common use.
The expansion of the interstate slave trade contributed to the "economic revival of once depressed seaboard states" as demand accelerated the value of slaves who were subject to sale.
Some traders moved their "chattels" by sea, with Norfolk to New Orleans being the most common route, but most slaves were forced to walk overland. Others were shipped downriver from such markets as Louisville on the Ohio River, and Natchez on the Mississippi.
Traders created regular migration routes served by a network of slave pens, yards and warehouses needed as temporary housing for the slaves. In addition, other vendors provided clothes, food and supplies for slaves. As the trek advanced, some slaves were sold and new ones purchased. Berlin concluded, "In all, the slave trade, with its hubs and regional centers, its spurs and circuits, reached into every cranny of southern society. Few southerners, black or white, were untouched."
Once the trip ended, slaves faced a life on the frontier significantly different from most labor in the Upper South. Clearing trees and starting crops on virgin fields was harsh and backbreaking work. A combination of inadequate nutrition, bad water and exhaustion from both the journey and the work weakened the newly arrived slaves and produced casualties.
New plantations were located at rivers' edges for ease of transportation and travel. Mosquitoes and other environmental challenges spread disease, which took the lives of many slaves. They had acquired only limited immunities to lowland diseases in their previous homes. The death rate was so high that, in the first few years of hewing a plantation out of the wilderness, some planters preferred whenever possible to use rented slaves rather than their own.
The harsh conditions on the frontier increased slave resistance and led owners and overseers to rely on violence for control. Many of the slaves were new to cotton fields and unaccustomed to the "sunrise-to-sunset gang labor" required by their new life. Slaves were driven much harder than when they had been in growing tobacco or wheat back East.
Slaves had less time and opportunity to improve the quality of their lives by raising their own livestock or tending vegetable gardens, for either their own consumption or trade, as they could in the East.
In Louisiana, French colonists had established sugar cane plantations and exported sugar as the chief commodity crop. After the Louisiana Purchase in 1803, Americans entered the state and joined the sugar cultivation. Between 1810 and 1830, planters bought slaves from the North and the number of slaves increased from fewer than 10,000 to more than 42,000. Planters preferred young males, who represented two-thirds of the slave purchases. Dealing with sugar cane was even more physically demanding than growing cotton. The largely young, unmarried male slave force made the reliance on violence by the owners "especially savage".
New Orleans became nationally important as a slave market and port, as slaves were shipped from there upriver by steamboat to plantations on the Mississippi River; it also sold slaves who had been shipped downriver from markets such as Louisville. By 1840, it had the largest slave market in North America. It became the wealthiest and the fourth-largest city in the nation, based chiefly on the slave trade and associated businesses. The trading season was from September to May, after the harvest.
The notion that slave traders were social outcasts of low reputation, even in the South, was initially promulgated by defensive southerners and later by figures like historian Ulrich B. Phillips.
Historian Frederic Bancroft, author of Slave-Trading in the Old South (1931) found—to the contrary of Phillips' position—that many traders were esteemed members of their communities. Contemporary researcher Steven Deyle argues that the "trader's position in society was not unproblematic and owners who dealt with the trader felt the need to satisfy themselves that they acted honorably," while Michael Tadman contends that "'trader as outcast' operated at the level of propaganda" whereas white slave owners almost universally professed a belief that slaves were not human like them, and thus dismissed the consequences of slave trading as beneath consideration.
Similarly, historian Charles Dew read hundreds of letters to slave traders and found virtually zero narrative evidence for guilt, shame, or contrition about the slave trade: "If you begin with the absolute belief in white supremacy—unquestioned white superiority/unquestioned black inferiority—everything falls neatly into place: the African is inferior racial 'stock,' living in sin and ignorance and barbarism and heathenism on the 'Dark Continent' until enslaved...Slavery thus miraculously becomes a form of 'uplift' for this supposedly benighted and brutish race of people. And once notions of white supremacy and black inferiority are in place in the American South, they are passed on from one generation to the next with all the certainty and inevitability of a genetic trait."
In the 1828 presidential election, candidate Andrew Jackson was strongly criticized by opponents as a slave trader who transacted in slaves in defiance of modern standards of morality.
See also: Bibliography of the slave trade in the United States
Treatment
Main article: Treatment of the enslaved in the United States
Further information:
- Slave health on plantations in the United States
- Slave quarters in the United States
- Field slaves in the United States
The treatment of slaves in the United States varied widely depending on conditions, time, and place, but in general it was brutal, especially on plantations. Whippings and rape were routine. The power relationships of slavery corrupted many whites who had authority over slaves, with children showing their own cruelty.
Masters and overseers resorted to physical punishments to impose their wills. Slaves were punished by whipping, shackling, hanging, beating, burning, mutilation, branding and imprisonment. Punishment was most often meted out in response to disobedience or perceived infractions, but sometimes abuse was carried out to re-assert the dominance of the master or overseer of the slave.
Treatment was usually harsher on large plantations, which were often managed by overseers and owned by absentee slaveholders, conditions that permitted abuses.
William Wells Brown, who escaped to freedom, reported that on one plantation, slave men were required to pick eighty pounds per day of cotton, while women were required to pick seventy pounds; if any slave failed in his or her quota, they were subject to whip lashes for each pound they were short. The whipping post stood next to the cotton scales.
A New York man who attended a slave auction in the mid-19th century reported that at least three-quarters of the male slaves he saw at sale had scars on their backs from whipping. By contrast, small slave-owning families had closer relationships between the owners and slaves; this sometimes resulted in a more humane environment but was not a given.
Historian Lawrence M. Friedman wrote: "Ten Southern codes made it a crime to mistreat a slave. ... Under the Louisiana Civil Code of 1825 (art. 192), if a master was 'convicted of cruel treatment', the judge could order the sale of the mistreated slave, presumably to a better master." Masters and overseers were seldom prosecuted under these laws. No slave could give testimony in the courts.
According to Adalberto Aguirre's research, 1,161 slaves were executed in the United States between the 1790s and 1850s. Quick executions of innocent slaves as well as suspects typically followed any attempted slave rebellions, as white militias overreacted with widespread killings that expressed their fears of rebellions, or suspected rebellions.
Although most slaves had lives that were very restricted in terms of their movements and agency, exceptions existed to virtually every generalization; for instance, there were also slaves who had considerable freedom in their daily lives: slaves allowed to rent out their labor and who might live independently of their master in cities, slaves who employed white workers, and slave doctors who treated upper-class white patients.
After 1820, in response to the inability to import new slaves from Africa and in part to abolitionist criticism, some slaveholders improved the living conditions of their slaves, to encourage them to be productive and to try to prevent escapes. It was part of a paternalistic approach in the antebellum era that was encouraged by ministers trying to use Christianity to improve the treatment of slaves.
Slaveholders published articles in Southern agricultural journals to share best practices in treatment and management of slaves; they intended to show that their system was better than the living conditions of northern industrial workers.
Medical care for slaves was limited in terms of the medical knowledge available to anyone. It was generally provided by other slaves or by slaveholders' family members, although sometimes "plantation physicians", like J. Marion Sims, were called by the owners to protect their investment by treating sick slaves. Many slaves possessed medical skills needed to tend to each other, and used folk remedies brought from Africa. They also developed new remedies based on American plants and herbs.
An estimated nine percent of slaves were disabled due to a physical, sensory, psychological, neurological, or developmental condition. However, slaves were often described as disabled if they were unable to work or bear a child, and were often subjected to harsh treatment as a result.
According to Andrew Fede, an owner could be held criminally liable for killing a slave only if the slave he killed was "completely submissive and under the master's absolute control". For example, in 1791 the North Carolina General Assembly defined the willful killing of a slave as criminal murder, unless done in resisting or under moderate correction (that is, corporal punishment).
While slaves' living conditions were poor by modern standards, Robert Fogel argued that all workers, free or slave, during the first half of the 19th century were subject to hardship. Unlike free individuals, however, enslaved people were far more likely to be underfed, physically punished, sexually abused, or killed, with no recourse, legal or otherwise, against those who perpetrated these crimes against them.
The commodification of the human body was legal in the case of African slaves, as they were not legally seen as fully human. For purposes of punishment, decoration, or self-expression:
- the skin of slaves was in many instances made into leather for furniture, accessories, and clothing;
- slave hair could be shaved and used as stuffing in pillows and furniture;
- in some instances, the inner body tissue of slaves (fat, bones, etc.) was made into soap, trophies, and other commodities.
Medical experimentation on slaves was also commonplace. Slaves were routinely used as medical specimens, forced to take part in:
- experimental surgeries,
- amputations,
- disease research,
- and the development of medical techniques.
In many cases, slave cadavers were used in demonstrations and on dissection tables.
Sexual abuse, sexual exploitation, and forced breeding:
Main articles:
- Slave breeding in the United States,
- Children of the plantation,
- Shadow family,
- and Enslaved women's resistance in the United States and Caribbean
Because of the power relationships at work, slave women in the United States were at high risk of rape and sexual abuse. Their children were repeatedly taken away from them and sold like farm animals; usually they never saw each other again.
Many slaves fought back against sexual attacks, and some died resisting. Others carried psychological and physical scars from the attacks.
Sexual abuse of slaves was partially rooted in a patriarchal Southern culture that treated black women as property or chattel.
Southern culture strongly policed against sexual relations between white women and black men on the purported grounds of racial purity, but, by the late 18th century, the many mixed-race slaves and slave children showed that white men had often taken advantage of slave women. Wealthy planter widowers, notably John Wayles and his son-in-law Thomas Jefferson, took slave women as concubines; each had six children with his partner: Elizabeth Hemings and her daughter Sally Hemings (the half-sister of Jefferson's late wife), respectively.
Both Mary Chesnut and Fanny Kemble, wives of planters, wrote about this issue in the antebellum South in the decades before the Civil War. Sometimes planters used mixed-race slaves as house servants or favored artisans because they were their children or other relatives.
As a result of centuries of slavery and such relationships, DNA studies have shown that the vast majority of African Americans also have historic European ancestry, generally through paternal lines.
The prohibition on the importation of slaves into the United States after 1808 limited the supply of slaves in the United States. This came at a time when the invention of the cotton gin enabled the expansion of cultivation of short-staple cotton in the uplands, leading to the clearing of land for cotton cultivation across large areas of the Deep South, especially the Black Belt. The demand for labor in the area increased sharply and led to an expansion of the internal slave market.
At the same time, the Upper South had an excess number of slaves because of a shift to mixed-crop agriculture, which was less labor-intensive than tobacco. To add to the supply of slaves, slaveholders looked at the fertility of slave women as part of their productivity, and intermittently forced the women to have large numbers of children. During this time period, the terms "breeders", "breeding slaves", "child bearing women", "breeding period", and "too old to breed" became familiar.
As breeding slaves for strength, fertility, or simple extra labor became popular on many plantations, many instances of "breeding farms" were documented in the United States, where slaves were forced to conceive and give birth to as many new slaves as possible; the largest of these farms were located in Virginia and Maryland.
Because the industry of slave breeding came from a desire for larger than natural population growth of slaves, slaveowners often turned towards systematic practices for creating more slaves. This often meant:
- forced pairings,
- back-to-back rapes and pregnancies forced on female slaves,
- and even incest upon enslaved families.
In the United States in the early 19th century, owners of female slaves could freely and legally use them as sexual objects. This followed the crews' free use of female slaves aboard slaving vessels.
The slaveholder has it in his power, to violate the chastity of his slaves. And not a few are beastly enough to exercise such power. Hence it happens that, in some families, it is difficult to distinguish the free children from the slaves. It is sometimes the case, that the largest part of the master's own children are born, not of his wife, but of the wives and daughters of his slaves, whom he has basely prostituted as well as enslaved. "This vice, this bane of society, has already become so common, that it is scarcely esteemed a disgrace."
"Fancy" was a code word which indicated that the girl or young woman was suitable for or trained for sexual use. In some cases, children were also abused in this manner. The sale of a 13-year-old "nearly a fancy" is documented. Zephaniah Kingsley, Jr., bought his wife when she was 13.
Furthermore, enslaved women who were old enough to bear children were encouraged to procreate, which raised their value as slaves, since their children would eventually provide labor or be sold, enriching the owners. Enslaved women were sometimes medically treated to enable or encourage their fertility.
The variations in skin color found in the United States make it obvious how often black women were impregnated by whites. For example, in the 1850 Census, 75.4% of "free negros" in Florida were described as mulattos, of mixed race.
Nevertheless, it is only very recently, with DNA studies, that any sort of reliable number can be provided, and the research has only begun. Light-skinned girls, who contrasted with the darker field workers, were preferred.
As Caroline Randall Williams was quoted in The New York Times: "You Want a Confederate Monument? My Body Is a Confederate Monument." "I have rape-colored skin," she added.
The sexual use of black slaves by either slave owners or by those who could purchase the temporary services of a slave took various forms. A slaveowner, or his teenage son, could go to the slave quarters area of the plantation and do what he wanted, with minimal privacy if any. It was common for a "house" female (housekeeper, maid, cook, laundress, or nanny) to be raped by one or more members of the household. Houses of prostitution throughout the slave states were largely staffed by female slaves providing sexual services, to their owners' profit. There were a small number of free black females engaged in prostitution, or concubinage, especially in New Orleans.
Slave owners who engaged in sexual activity with female slaves "were often the elite of the community. They had little need to worry about public scorn." These relationships "appear to have been tolerated and in some cases even quietly accepted". "Southern women ... do not trouble themselves about it". Franklin and Armfield, who were definitely the elite of the community, joked frequently in their letters about the black women and girls that they were raping. It never occurred to them that there was anything wrong in what they were doing.
Light-skinned young girls were sold openly for sexual use; their price was much higher than that of a field hand. Special markets for the fancy girl trade existed in New Orleans and Lexington, Kentucky. Historian Philip Shaw describes an occasion when Abraham Lincoln and Allen Gentry witnessed such sales in New Orleans in 1828:
Gentry vividly remembered a day in New Orleans when he and the nineteen-year-old Lincoln came upon a slave market. Pausing to watch, Gentry recalled looking down at Lincoln's hands and seeing that he "doubled his fists tightly; his knuckles went white". Men wearing black coats and white hats bought field hands, "black and ugly", for $500 to $800. And then the real horror began: when the sale of "fancy girls" started, Lincoln, "unable to stand it any longer", muttered to Gentry, "Allen that's a disgrace. If I ever get a lick at that thing I'll hit it hard."
Those girls who were "considered educated and refined, were purchased by the wealthiest clients, usually plantation owners, to become personal sexual companions". "There was a great demand in New Orleans for 'fancy girls'."
The issue which did come up frequently was the threat of sexual intercourse between black males and white females. Just as the black women were perceived as having "a trace of Africa, that supposedly incited passion and sexual wantonness", the men were perceived as savages, unable to control their lust, given an opportunity.
Another approach to the question was offered by Quaker and Florida planter Zephaniah Kingsley, Jr. He advocated, and personally practiced, deliberate racial mixing through marriage, as part of his proposed solution to the slavery issue: racial integration, called "amalgamation" at the time. In an 1829 treatise, he stated that mixed-race people were healthier and often more beautiful, that interracial sex was hygienic, and that slavery made it convenient. Because of these views, tolerated in Spanish Florida, he found it impossible to remain long in Territorial Florida, and moved with his slaves and multiple wives to a plantation, Mayorasgo de Koka, in Haiti (now in the Dominican Republic). There were many others who less flagrantly practiced interracial, common-law marriages with slaves (see Partus sequitur ventrem).
Slave codes:
Main article: Slave codes
To help regulate the relationship between slave and owner, including legal support for keeping the slave as property, states established slave codes, most based on laws existing since the colonial era. The code for the District of Columbia defined a slave as "a human being, who is by law deprived of his or her liberty for life, and is the property of another".
While each state had its own slave code, many concepts were shared throughout the slave states. According to the slave codes, some of which were passed in reaction to slave rebellions, teaching a slave to read or write was illegal. This prohibition was unique to American slavery, and was believed to reduce the risk of slaves forming aspirations that could lead to escape or rebellion.
Informal education occurred when white children taught slave companions what they were learning; in other cases, adult slaves learned from free artisan workers, especially if located in cities, where there was more freedom of movement.
In Alabama, slaves were not allowed to leave their master's premises without written consent or passes. This was a common requirement in other states as well, and locally run patrols (known to slaves as pater rollers) often checked the passes of slaves who appeared to be away from their plantations.
In Alabama slaves were prohibited from trading goods among themselves. In Virginia, a slave was not permitted to drink in public within one mile of his master or during public gatherings. Slaves were not permitted to carry firearms in any of the slave states.
Slaves were generally prohibited by law from associating in groups, with the exception of worship services (a reason why the Black Church is such a notable institution in black communities today). Following Nat Turner's rebellion in 1831, which raised white fears throughout the South, some states also prohibited or restricted religious gatherings of slaves, or required that they be officiated by white men. Planters feared that group meetings would facilitate communication among slaves that could lead to rebellion. Slaves held private, secret "brush meetings" in the woods.
In Ohio, an emancipated slave was prohibited from returning to the state in which he or she had been enslaved. Other Northern states discouraged the settling of free blacks within their boundaries. Fearing the influence of free blacks, Virginia and other Southern states passed laws to require blacks who had been freed to leave the state within a year (or sometimes less time) unless granted a stay by an act of the legislature.
Further information: Marriage of enslaved people (United States)
Religion:
Further information: Religion of black Americans and Black Catholicism
Africans brought their religions with them from Africa, including Islam, Catholicism, and traditional religions.
Prior to the American Revolution, masters and revivalists spread Christianity to slave communities, including:
- Catholicism in Spanish Florida, California, and French and Spanish Louisiana,
- and Protestantism in English colonies, supported by the Society for the Propagation of the Gospel.
In the First Great Awakening of the mid-18th century, Baptists and Methodists from New England preached a message against slavery, encouraged masters to free their slaves, converted both slaves and free blacks, and gave them active roles in new congregations.
The first independent black congregations were started in the South before the Revolution, in South Carolina and Georgia. Believing that "slavery was contrary to the ethics of Jesus", Christian congregations and church clergy, especially in the North, played a role in the Underground Railroad (see next topic), especially Wesleyan Methodists, Quakers and Congregationalists.
Over the decades and with the growth of slavery throughout the South, some Baptist and Methodist ministers gradually changed their messages to accommodate the institution. After 1830, white Southerners argued for the compatibility of Christianity and slavery, with a multitude of both Old and New Testament citations. They promoted Christianity as encouraging better treatment of slaves and argued for a paternalistic approach.
In the 1840s and 1850s, the issue of accepting slavery split the nation's largest religious denominations (the Methodist, Baptist and Presbyterian churches) into separate Northern and Southern organizations; see the Methodist Episcopal Church, South, the Southern Baptist Convention, and the Presbyterian Church in the Confederate States of America. Schisms occurred, such as that between the Wesleyan Methodist Church and the Methodist Episcopal Church.
Southern slaves generally attended their masters' white churches, where they often outnumbered the white congregants. They were usually permitted to sit only in the back or in the balcony. They listened to white preachers, who emphasized the obligation of slaves to keep in their place, and acknowledged the slave's identity as both person and property.
Preachers taught the master's responsibility and the concept of appropriate paternal treatment, using Christianity to improve conditions for slaves, and to treat them "justly and fairly". This included masters having self-control, not disciplining under anger, not threatening, and ultimately fostering Christianity among their slaves by example.
Slaves also created their own religious observances, meeting alone without the supervision of their white masters or ministers. The larger plantations, with groups of 20 or more slaves, tended to be centers of nighttime meetings of one or several plantation slave populations.
These congregations revolved around a single preacher, often illiterate and with limited knowledge of theology, who was marked by his personal piety and ability to foster a spiritual environment. African Americans developed a theology related to Biblical stories having the most meaning for them, including the hope for deliverance from slavery by their own Exodus. One lasting influence of these secret congregations is the African American spiritual.
Mandatory illiteracy:
Main article: Anti-literacy laws in the United States
Further information:
- Education during the slave period in the United States
- Education of freed people during the Civil War
In a feature unique to American slavery, legislatures across the South enacted new laws to curtail the already limited rights of African Americans.
For example, Virginia prohibited blacks, free or slave, from practicing preaching, prohibited them from owning firearms, and forbade anyone to teach slaves or free blacks how to read. It specified heavy penalties for both student and teacher if slaves were taught, including whippings or jail.
[E]very assemblage of negroes for the purpose of instruction in reading or writing, or in the night time for any purpose, shall be an unlawful assembly. Any justice may issue his warrant to any officer or other person, requiring him to enter any place where such assemblage may be, and seize any negro therein; and he, or any other justice, may order such negro to be punished with stripes.
Slave owners saw literacy as a threat to the institution of slavery and their financial investment in it; as a North Carolina statute passed in 1830-1831 stated, "Teaching slaves to read and write, tends to excite dissatisfaction in their minds, and to produce insurrection and rebellion."
Literacy enabled the enslaved to read the writings of abolitionists, which discussed the abolition of slavery and described the slave revolution in Haiti of 1791–1804 and the end of slavery in the British Empire in 1833.
It also allowed slaves to learn that thousands of enslaved individuals had escaped, often with the assistance of the Underground Railroad. Literacy also was believed to make the enslaved unhappy at best, insolent and sullen at worst. As put by prominent Washington lawyer Elias B. Caldwell in 1822: "The more you improve the condition of these people, the more you cultivate their minds, the more miserable you make them, in their present state. You give them a higher relish for those privileges which they can never attain, and turn what we intend for a blessing [slavery] into a curse. No, if they must remain in their present situation, keep them in the lowest state of degradation and ignorance. The nearer you bring them to the condition of brutes, the better chance do you give them of possessing their apathy."
Unlike in the South, slave owners in Utah were required to send their slaves to school. Black slaves did not have to spend as much time in school as Indian slaves.
Freedom suits and Dred Scott:
Main articles: Dred Scott v. Sandford and Freedom suits
With the development of slave and free states after the American Revolution, and far-flung commercial and military activities, new situations arose in which slaves might be taken by masters into free states. Most free states not only prohibited slavery, but ruled that slaves brought and kept there illegally could be freed.
Such cases were sometimes known as transit cases. Dred Scott and his wife Harriet Scott each sued for freedom in St. Louis after the death of their master, based on their having been held in a free territory (the northern part of the Louisiana Purchase from which slavery was excluded under the terms of the Missouri Compromise). (Later the two cases were combined under Dred Scott's name.)
Scott filed suit for freedom in 1846 and went through two state trials, the first denying and the second granting freedom to the couple (and, by extension, their two daughters, who had also been held illegally in free territories).
For 28 years, Missouri state precedent had generally respected laws of neighboring free states and territories, ruling for freedom in such transit cases where slaves had been held illegally in free territory. But in the Dred Scott case, the Missouri Supreme Court ruled against the slaves.
After Scott and his team appealed the case to the U.S. Supreme Court, Chief Justice Roger B. Taney, in a sweeping decision, denied Scott his freedom. The 1857 decision, decided 7–2, held that:
- a slave did not become free when taken into a free state;
- Congress could not bar slavery from a territory;
- people of African descent imported into the United States and held as slaves, or their descendants, could never be citizens, and thus had no standing to bring suit in a U.S. court;
- and a state could not bar slaveowners from bringing slaves into that state.
Many Republicans, including Abraham Lincoln, considered the decision unjust and evidence that the Slave Power had seized control of the Supreme Court. Anti-slavery groups were enraged and slave owners encouraged, escalating the tensions that led to civil war.
1850 to the firing on Fort Sumter:
Further information:
- Timeline of events leading to the American Civil War § Compromise of 1850 to the Election of 1860
- Timeline of events leading to the American Civil War § Election of 1860 to the Battle of Fort Sumter
In 1850, Congress passed the Fugitive Slave Act, as part of the Compromise of 1850, which required law enforcement and citizens of free states to cooperate in the capture and return of slaves.
This met with considerable overt and covert resistance in free states and cities such as Philadelphia, New York, and Boston. Refugees from slavery continued to flee the South across the Ohio River and other parts of the Mason–Dixon line dividing North from South, to the North and Canada via the Underground Railroad.
Some white Northerners helped hide former slaves from their former owners or helped them reach freedom in Canada.
As part of the Compromise of 1850, Congress abolished the slave trade (though not the ownership of slaves) in the District of Columbia. Fearing this would happen, Alexandria, a regional slave-trading center and port, had successfully sought its removal from the District of Columbia and retrocession to Virginia.
After 1854, Republicans argued that the "Slave Power", especially the pro-slavery Democratic Party in the South, controlled two of the three branches of the Federal government.
The abolitionists, realizing that the total elimination of slavery was unrealistic as an immediate goal, worked to prevent the expansion of slavery into the western territories which eventually would be new states. The Missouri Compromise, the Compromise of 1850, and the Bleeding Kansas period dealt with whether new states would be slave or free, or how that was to be decided. Both sides were anxious about effects of these decisions on the balance of power in the Senate.
Main article: Bleeding Kansas
After the passage of the Kansas–Nebraska Act in 1854, border fighting broke out in the Kansas Territory, where the question of whether it would be admitted to the Union as a slave or free state was left to the inhabitants. Migrants from both free and slave states moved into the territory to prepare for the vote on slavery.
Abolitionist John Brown, the most famous of the anti-slavery immigrants, was active in the fighting in "Bleeding Kansas", but so too were many white Southerners (many from adjacent Missouri) who opposed abolition.
Abraham Lincoln's and the Republicans' political platform in 1860 was to stop slavery's expansion. Historian James M. McPherson says that in his famous "House Divided" speech in 1858, Lincoln said "American republicanism can be purified by restricting the further expansion of slavery as the first step to putting it on the road to 'ultimate extinction.' Southerners took Lincoln at his word. When he won the presidency, they left the Union to escape the 'ultimate extinction' of slavery."
Main article: 1860 United States presidential election
The divisions became fully exposed with the 1860 presidential election. The electorate split four ways. The Southern Democrats endorsed slavery, while the Republican Party denounced it. The Northern Democrats said democracy required the people to decide on slavery locally, state by state and territory by territory. The Constitutional Union Party said the survival of the Union was at stake and everything else should be compromised.
Lincoln, the Republican, won with a plurality of popular votes and a majority of electoral votes. Lincoln, however, did not appear on the ballots of ten southern slave states.
Many slave owners in the South feared that the real intent of the Republicans was the abolition of slavery in states where it already existed, and that the sudden emancipation of four million slaves would be disastrous for the slave owners and for the economy that drew its greatest profits from the labor of people who were not paid.
The slave owners feared that ending the balance could lead to the domination of the federal government by the northern free states. This led seven southern states to secede from the Union. When the Confederate Army attacked a U.S. Army installation at Fort Sumter, the American Civil War began and four additional slave states seceded.
Northern leaders had viewed the slavery interests as a threat politically, but with secession, they viewed the prospect of a new Southern nation, the Confederate States of America, with control over the Mississippi River and parts of the West, as politically unacceptable. Most of all, they could not accept this repudiation of American nationalism.
Civil War and emancipation:
Main article: Slavery during the American Civil War
American Civil War:
Main articles:
- Origins of the American Civil War,
- American Civil War,
- Contraband (American Civil War),
- Military history of African Americans in the American Civil War
The consequent American Civil War, beginning in 1861, led to the end of chattel slavery in America. Not long after the war broke out, through a legal maneuver by Union General Benjamin F. Butler, a lawyer by profession, slaves who fled to Union lines were considered "contraband of war".
General Butler ruled that they were not subject to return to Confederate owners as they had been before the war. "Lincoln and his Cabinet discussed the issue on May 30 and decided to support Butler's stance". Soon word spread, and many slaves sought refuge in Union territory, desiring to be declared "contraband".
Many of the "contrabands" joined the Union Army as workers or troops, forming entire regiments of the U.S. Colored Troops. Others went to refugee camps such as the Grand Contraband Camp near Fort Monroe or fled to northern cities. General Butler's interpretation was reinforced when Congress passed the Confiscation Act of 1861, which declared that any property used by the Confederate military, including slaves, could be confiscated by Union forces.
At the beginning of the war, some Union commanders thought they were supposed to return escaped slaves to their masters. By 1862, when it became clear that this would be a long war, the question of what to do about slavery became more general.
The Southern economy and military effort depended on slave labor. It began to seem unreasonable to protect slavery while blockading Southern commerce and destroying Southern production. As Congressman George W. Julian of Indiana put it in an 1862 speech in Congress, the slaves "cannot be neutral. As laborers, if not as soldiers, they will be allies of the rebels, or of the Union."
Julian and his fellow Radical Republicans put pressure on Lincoln to rapidly emancipate the slaves, whereas moderate Republicans came to accept gradual, compensated emancipation and colonization. Copperheads, the border states and War Democrats opposed emancipation, although the border states and War Democrats eventually accepted it as part of total war needed to save the Union.
See also: Confiscation Acts and Act Prohibiting the Return of Slaves
Emancipation Proclamation:
Main article: Emancipation Proclamation
The Emancipation Proclamation was an executive order issued by President Abraham Lincoln on January 1, 1863.
In a single stroke it changed the legal status, as recognized by the U.S. government, of three million slaves in designated areas of the Confederacy from "slave" to "free". It had the practical effect that as soon as a slave escaped the control of his or her owner, by running away or through advances of federal troops, the slave became legally and actually free.
Plantation owners, realizing that emancipation would destroy their economic system, sometimes moved their slaves as far as possible out of reach of the Union army. By June 1865, the Union Army controlled all of the Confederacy and had liberated all of the designated slaves.
In 1861, Lincoln expressed the fear that premature attempts at emancipation would mean the loss of the border states. He believed that "to lose Kentucky is nearly the same as to lose the whole game." At first, Lincoln reversed attempts at emancipation by Secretary of War Simon Cameron and Generals John C. Frémont (in Missouri) and David Hunter (in South Carolina, Georgia and Florida) to keep the loyalty of the border states and the War Democrats.
On July 22, 1862, Lincoln told his cabinet of his plan to issue a preliminary Emancipation Proclamation. Secretary of State William H. Seward advised Lincoln to wait for a victory before issuing the proclamation, as to do otherwise would seem like "our last shriek on the retreat". On September 17, 1862, the Battle of Antietam provided this opportunity, and on September 22, 1862, Lincoln issued his preliminary Emancipation Proclamation, which provided that enslaved people in the states in rebellion against the United States on January 1, 1863, "shall be then, thenceforward, and forever free".
On September 24 and 25, the War Governors' Conference added support for the proclamation.
Lincoln issued his final Emancipation Proclamation on January 1, 1863. In his letter to Hodges, Lincoln explained his belief that "If slavery is not wrong, nothing is wrong ... And yet I have never understood that the Presidency conferred upon me an unrestricted right to act officially upon this judgment and feeling ... I claim not to have controlled events, but confess plainly that events have controlled me."
Lincoln's Emancipation Proclamation promised freedom for slaves in the Confederate states and authorized the enlistment of African Americans in the Union Army. The Emancipation Proclamation did not free slaves in the border states, which were the slaveholding states that remained in the Union. As a practical matter, the proclamation freed only those slaves who escaped to Union lines. But the proclamation made the abolition of slavery an official war goal and was implemented as the Union took territory from the Confederacy. According to the Census of 1860, this policy would free nearly four million slaves, or over 12% of the total population of the United States.
Because the Emancipation Proclamation was issued under the President's war powers, it might not have continued in force after the war ended. Therefore, Lincoln played a leading role in getting the constitutionally required two-thirds majority of both houses of Congress to vote for the Thirteenth Amendment, which made emancipation universal and permanent.
Enslaved African Americans had not waited for Lincoln before escaping and seeking freedom behind Union lines. From the early years of the war, hundreds of thousands of African Americans escaped to Union lines, especially in Union-controlled areas such as Norfolk and the Hampton Roads region of Virginia in 1862, Tennessee from 1862 on, and along the line of Sherman's march. So many African Americans fled to Union lines that commanders created camps and schools for them, where both adults and children learned to read and write.
The American Missionary Association entered the war effort by sending teachers south to such contraband camps, for instance, establishing schools in Norfolk and on nearby plantations.
In addition, nearly 200,000 African-American men served with distinction in the Union forces as soldiers and sailors; most were escaped slaves. The Confederacy was outraged by armed black soldiers and refused to treat them as prisoners of war. They murdered many, as at the Fort Pillow massacre, and re-enslaved others.
On February 24, 1863, the Arizona Organic Act abolished slavery in the newly formed Arizona Territory. Tennessee and all of the border states (except Kentucky and Delaware) abolished slavery by early 1865. Thousands of slaves were freed by the operation of the Emancipation Proclamation as Union armies marched across the South. Emancipation came to the remaining Southern slaves after the surrender of all Confederate troops in spring 1865.
In spite of the South's shortage of manpower, until 1865, most Southern leaders opposed arming slaves as soldiers. However, a few Confederates discussed arming slaves. Finally, in early 1865, General Robert E. Lee said that black soldiers were essential, and legislation was passed. The first black units were in training when the war ended in April.
End of slavery:
Main article: End of slavery in the United States of America
Further information:
- Slave states and free states § End of slavery
- Emancipation Day § United States
- Compensated emancipation in the United States
Booker T. Washington remembered Emancipation Day in early 1863, when he was a boy of nine in Virginia: "As the great day drew nearer, there was more singing in the slave quarters than usual. It was bolder, had more ring, and lasted later into the night. Most of the verses of the plantation songs had some reference to freedom. ... Some man who seemed to be a stranger (a United States officer, I presume) made a little speech and then read a rather long paper – the Emancipation Proclamation, I think. After the reading we were told that we were all free, and could go when and where we pleased. My mother, who was standing by my side, leaned over and kissed her children, while tears of joy ran down her cheeks. She explained to us what it all meant, that this was the day for which she had been so long praying, but fearing that she would never live to see."
The war ended on June 22, 1865, and following that surrender, the Emancipation Proclamation was enforced throughout the remaining regions of the South that had not yet freed the slaves. Slavery officially continued for a couple of months in other locations. Federal troops arrived in Galveston, Texas, on June 19, 1865, to enforce the emancipation. The commemoration of that event, Juneteenth National Independence Day, was declared a national holiday in 2021.
The Thirteenth Amendment, abolishing slavery except as punishment for a crime, had been passed by the Senate in April 1864, and by the House of Representatives in January 1865. The amendment did not take effect until it was ratified by three-fourths of the states, which occurred on December 6, 1865, when Georgia ratified it. On that date, the last 40,000–45,000 enslaved Americans in the remaining two slave states of Kentucky and Delaware, as well as the 200 or so perpetual apprentices in New Jersey left from the very gradual emancipation process begun in 1804, were freed.
Reconstruction to the present:
See also: History of unfree labor in the United States and History of civil rights in the United States
Journalist Douglas A. Blackmon reported in his Pulitzer Prize-winning book Slavery By Another Name that many black persons were virtually enslaved under convict leasing programs, which started after the Civil War. Most Southern states had no prisons; they leased convicts to businesses and farms for their labor, and the lessee paid for food and board. Incentives for abuse were present.
The continued involuntary servitude took various forms, but the primary forms included convict leasing, peonage and sharecropping, with the latter eventually encompassing poor whites as well.
By the 1930s, whites constituted most of the sharecroppers in the South. Mechanization of agriculture had reduced the need for farm labor, and many black people left the South in the Great Migration. Jurisdictions and states created fines and sentences for a wide variety of minor crimes and used these as an excuse to arrest and sentence black people.
Under convict-leasing programs, African-American men, often guilty of petty crimes or even no crime at all, were arrested, compelled to work without pay, repeatedly bought and sold, and coerced to do the bidding of the leaseholder.
Sharecropping, as it was practiced during this period, often involved severe restrictions on the freedom of movement of sharecroppers, who could be whipped for leaving the plantation.
Both sharecropping and convict leasing were legal and tolerated by both the North and South. However, peonage was an illicit form of forced labor. Its existence was ignored by authorities while thousands of African Americans and poor white Americans were subjugated and held in bondage until the mid-1960s to the late 1970s.
Apart from cases of peonage, after the Reconstruction era the federal government took almost no action to enforce the 13th Amendment until December 1941, when President Franklin Delano Roosevelt summoned his attorney general.
Five days after the attack on Pearl Harbor, at the request of the President, Attorney General Francis Biddle issued Circular No. 3591 to all federal prosecutors, instructing them to investigate actively and try any case of involuntary servitude or slavery. Several months later, convict leasing was officially abolished. But aspects have persisted in other forms.
Historians argue that other systems of penal labor were all created in 1865, and convict leasing was simply the most oppressive form. Over time, a large civil rights movement arose to bring full civil rights and equality under the law to all Americans.
Convict leasing:
With emancipation a legal reality, white Southerners were concerned with both controlling the newly freed slaves and keeping them in the labor force at the lowest level. The system of convict leasing began during Reconstruction and was fully implemented in the 1880s, officially ending in the last state, Alabama, in 1928.
It persisted in various forms until it was abolished in 1942 by President Franklin D. Roosevelt during World War II, several months after the attack on Pearl Harbor brought the U.S. into the conflict. This system allowed private contractors to purchase the services of convicts from the state or local governments for a specific time period. African Americans, due to "vigorous and selective enforcement of laws and discriminatory sentencing", made up the vast majority of the convicts leased.
Writer Douglas A. Blackmon writes of the system: "It was a form of bondage distinctly different from that of the antebellum South in that for most men, and the relatively few women drawn in, this slavery did not last a lifetime and did not automatically extend from one generation to the next. But it was nonetheless slavery – a system in which armies of free men, guilty of no crimes and entitled by law to freedom, were compelled to labor without compensation, were repeatedly bought and sold, and were forced to do the bidding of white masters through the regular application of extraordinary physical coercion".
The constitutional basis for convict leasing is that the Thirteenth Amendment, while abolishing slavery and involuntary servitude generally, expressly permits it as a punishment for crime.
Educational issues:
Historian Mark Wahlgren Summers notes that the estimated literacy rate among formerly enslaved southern blacks at the time of emancipation was 5 to 10 percent, but had reached a baseline of 40 to 50 percent (and higher in cities) by the turn of the century, representing a "great advance." As W. E. B. Du Bois noted, the black colleges were not perfect, but "in a single generation they put thirty thousand black teachers in the South" and "wiped out the illiteracy of the majority of black people in the land".
Northern philanthropists continued to support black education in the 20th century. George Eastman, for example, was a major donor to Hampton Institute and Tuskegee, and also helped fund health programs at colleges and in communities.
Apologies:
Main article: Public apologies for slavery in the United States
In the 21st century, various legislative bodies have issued public apologies for slavery in the United States.
Political legacy:
A 2016 study, published in The Journal of Politics, finds that "[w]hites who currently live in Southern counties that had high shares of slaves in 1860 are more likely to identify as a Republican, oppose affirmative action, and express racial resentment and colder feelings toward blacks."
The study contends that "contemporary differences in political attitudes across counties in the American South in part trace their origins to slavery's prevalence more than 150 years ago."
The authors argue that their findings are consistent with the theory that "following the Civil War, Southern whites faced political and economic incentives to reinforce existing racist norms and institutions to maintain control over the newly freed African American population. This amplified local differences in racially conservative political attitudes, which in turn have been passed down locally across generations."
A 2017 study in the British Journal of Political Science argued that the British American colonies without slavery adopted better democratic institutions to attract migrant workers to their colonies.
An article published in the Journal of Economic History in 2022 finds that former slave owners remained politically dominant long after the abolition of slavery. Using data from Texas, the authors find that "[i]n 1900, still around 50 percent of all state legislators came from a slave-owning background."
Economics:
Robert Fogel and Stanley Engerman, in their 1974 book Time on the Cross, argued that the rate of return of slavery at the market price was close to ten percent, a number close to investment in other assets.
The transition from indentured servants to slaves is cited to show that slaves offered greater profits to their owners. A qualified consensus among economic historians and economists is that "Slave agriculture was efficient compared with free agriculture. Economies of scale, effective management, and intensive utilization of labor and capital made southern slave agriculture considerably more efficient than nonslave southern farming", and it is the near-universal consensus among economic historians and economists that slavery was not "a system irrationally kept in existence by plantation owners who failed to perceive or were indifferent to their best economic interests".
In the antebellum period, the price of slaves relative to indentured servants declined. Indentured servants became more costly as the demand for skilled labor in England increased. At the same time, slaves were mostly supplied from within the United States, so language was not a barrier, and the cost of transporting slaves from one state to another was relatively low.
However, as in Brazil and Europe, slavery at its end in the United States tended to be concentrated in the poorest regions of the United States, with a qualified consensus among economists and economic historians concluding that the "modern period of the South's economic convergence to the level of the North only began in earnest when the institutional foundations of the southern regional labor market were undermined, largely by federal farm and labor legislation dating from the 1930s."
In the decades preceding the Civil War, the black population of the United States experienced a rapid natural increase. Unlike the trans-Saharan slave trade with Africa, the slave population transported by the Atlantic slave trade to the United States was sex-balanced.
The slave population multiplied nearly fourfold between 1810 and 1860, despite the passage of the Act Prohibiting Importation of Slaves signed into law by President Thomas Jefferson in 1807 banning the international slave trade. Thus, it is also the universal consensus among modern economic historians and economists that slavery in the United States was not "economically moribund on the eve of the Civil War".
In the 2010s, several historians, among them Edward E. Baptist, Sven Beckert, Walter Johnson and Calvin Schermerhorn, have posited that slavery was integral in the development of American capitalism.
Johnson wrote in River of Dark Dreams (2013): "The cords of credit and debt—of advance and obligation—that cinched the Atlantic economy together were anchored with the mutually defining values of land and slaves: without land and slaves, there was no credit, and without slaves, land itself was valueless. Promises made in the Mississippi Valley were backed by the value of slaves and fulfilled in their labor."
Other economic historians have rejected that thesis.
A 2023 study estimates that prior to the onset of the US Civil War, the enslaved population produced 12.6% of US national product.
Efficiency of slaves:
Scholars disagree on how to quantify the efficiency of slavery. In Time on the Cross Fogel and Engerman equate efficiency to total factor productivity (TFP), the output per average unit of input on a farm.
Using this measurement, Southern farms that enslaved black people using the gang system were 35% more efficient than Northern farms, which used free labor. Under the gang system, groups of slaves performed synchronized tasks under the constant vigilance of an overseer. Each group was like a part of a machine; a slave perceived to be working below his capacity could be punished.
Fogel argues that this kind of negative enforcement was not frequent and that slaves and free laborers had a similar quality of life; however, there is controversy on this last point. A critique of Fogel and Engerman's view was published by Paul A. David in 1976.
In 1995, a random survey of 178 members of the Economic History Association sought to study the views of economists and economic historians on the debate. The study found that 72 percent of economists and 65 percent of economic historians would generally agree that "Slave agriculture was efficient compared with free agriculture. Economies of scale, effective management, and intensive utilization of labor and capital made southern slave agriculture considerably more efficient than nonslave southern farming."
48 percent of the economists agreed without provisos, while 24 percent agreed when provisos were included in the statement. On the other hand, 58 percent of economic historians and 42 percent of economists disagreed with Fogel and Engerman's "proposition that the material (not psychological) conditions of the lives of slaves compared favorably with those of free industrial workers in the decades before the Civil War".
Prices of slaves:
Because the United States had a capitalist economy, the price of slaves was determined by the law of supply and demand. For example, following bans on the import of slaves after the UK's Slave Trade Act 1807 and the American Act Prohibiting Importation of Slaves of 1807, the prices for slaves increased.
The markets for the products produced by slaves also affected the price of slaves (e.g. the price of slaves fell when the price of cotton fell in 1840). Anticipation of slavery's abolition also influenced prices. During the Civil War, the price for slave men in New Orleans dropped from $1,381 in 1861 to $1,116 by 1862 (the city was captured by U.S. forces in the spring of 1862).
Controlling for inflation, prices of slaves rose dramatically in the six decades prior to the Civil War, reflecting demand due to commodity cotton, as well as use of slaves in shipping and manufacturing.
Although the prices of slaves relative to indentured servants declined, both became more expensive. Cotton production was rising and relied on the use of slaves to yield high profits. Fogel and Engerman initially argued that, had the Civil War not happened, slave prices would have increased even further, by an average of more than fifty percent by 1890.
Prices reflected the characteristics of the slave; such factors as sex, age, nature, and height were all taken into account to determine the price. Over the life-cycle, the price of enslaved women was higher than that of their male counterparts up to puberty, as they would likely bear children whom their masters could sell as slaves or use as slave laborers.
Men around the age of 25 were the most valued, as they were at the highest level of productivity and still had a considerable life-span. If slaves had a history of fights or escapes, their price was lowered, reflecting what planters believed was the risk of such behavior repeating.
Slave traders and buyers would examine a slave's back for whipping scars; a large number of injuries would be seen as evidence of laziness or rebelliousness, rather than the previous master's brutality, and would lower the slave's price. Taller male slaves were priced at a higher level, as height was viewed as a proxy for fitness and productivity.
Effects on Southern economic development:
While slavery brought profits in the short run, discussion continues on the economic benefits of slavery in the long run. In 1995, a random anonymous survey of 178 members of the Economic History Association found that out of the forty propositions about American economic history that were surveyed, the group of propositions most disputed by economic historians and economists were those about the postbellum economy of the American South (along with the Great Depression).
The only exception was the proposition initially put forward by historian Gavin Wright that the "modern period of the South's economic convergence to the level of the North only began in earnest when the institutional foundations of the southern regional labor market were undermined, largely by federal farm and labor legislation dating from the 1930s."
62 percent of economists (24 percent with and 38 percent without provisos) and 73 percent of historians (23 percent with and 50 percent without provisos) agreed with this statement. Wright has also argued that the private investment of monetary resources in the cotton industry, among others, delayed development in the South of commercial and industrial institutions. There was little public investment in railroads or other infrastructure.
Wright argues that agricultural technology was far more developed in the South, representing an economic advantage of the South over the North of the United States.
In Democracy in America, Alexis de Tocqueville noted that "the colonies in which there were no slaves became more populous and more rich than those in which slavery flourished".
In 1857, in The Impending Crisis of the South: How to Meet It, Hinton Rowan Helper made the same point. Economists Peter H. Lindert and Jeffrey G. Williamson, in a pair of articles published in 2012 and 2013, found that, despite the American South initially having per capita income roughly double that of the North in 1774, incomes in the South had declined 27% by 1800 and continued to decline over the next four decades, while the economies in New England and the Mid-Atlantic states vastly expanded.
By 1840, per capita income in the South was well behind the Northeast and the national average (Note: this is also true in the early 21st century).
Lindert and Williamson argue that this antebellum period is an example of what economists Daron Acemoglu, Simon Johnson, and James A. Robinson call "a reversal of fortune". In his essay "The Real History of Slavery", economist Thomas Sowell reiterated and augmented the observation made by de Tocqueville by comparing slavery in the United States to slavery in Brazil. He notes that slave societies reflected similar economic trends in those and other parts of the world, suggesting that the trend Lindert and Williamson identify may have continued until the American Civil War: "Both in Brazil and in the United States – the countries with the two largest slave populations in the Western Hemisphere – the end of slavery found the regions in which slaves had been concentrated poorer than other regions of these same countries.
For the United States, a case could be made that this was due to the Civil War, which did so much damage to the South, but no such explanation would apply to Brazil, which fought no Civil War over this issue. Moreover, even in the United States, the South lagged behind the North in many ways even before the Civil War.
Although slavery in Europe died out before it was abolished in the Western Hemisphere, as late as 1776 slavery had not yet died out all across the continent when Adam Smith wrote in The Wealth of Nations that it still existed in some eastern regions.
But, even then, Eastern Europe was much poorer than Western Europe. The slavery of North Africa and the Middle East, over the centuries, took more slaves from sub-Saharan Africa than the Western Hemisphere did ... But these remained largely poor countries until the discovery and extraction of their vast oil deposits."
Sowell also notes in Ethnic America: A History, citing historians Clement Eaton and Eugene Genovese, that three-quarters of Southern white families owned no slaves at all. Most slaveholders lived on farms rather than plantations, and few plantations were as large as the fictional ones depicted in Gone with the Wind.
In "The Real History of Slavery", Sowell also draws a comparison with slavery in the Arab world and the Middle East (where slaves were seldom used for productive purposes) and in China (where slaves consumed the entire output they created). He observes that many commercial slaveowners in the antebellum South tended to be spendthrifts, and many lost their plantations through creditor foreclosures; in Britain, profits by British slave traders amounted to only two percent of British domestic investment at the height of the Atlantic slave trade in the 18th century.
Sowell draws the following conclusion regarding the macroeconomic value of slavery:
"In short, even though some individual slaveowners grew rich and some family fortunes were founded on the exploitation of slaves, that is very different from saying that the whole society, or even its non-slave population as a whole, was more economically advanced than it would have been in the absence of slavery. What this means is that, whether employed as domestic servants or producing crops or other goods, millions suffered exploitation and dehumanization for no higher purpose than the ... aggrandizement of slaveowners."
Eric Hilt noted that some historians have suggested slavery was necessary for the Industrial Revolution, on the grounds that American slave plantations produced most of the raw cotton for the British textiles market, which was the vanguard of the Industrial Revolution. Hilt argued that it is not clear whether this is actually true: there is no evidence that cotton could not have been mass-produced by yeoman farmers rather than slave plantations had the latter not existed (their existence tended to force yeoman farmers into subsistence farming), and there is some evidence that it certainly could have been.
The soil and climate of the American South were excellent for growing cotton, so it is not unreasonable to postulate that farms without slaves could have produced substantial amounts of cotton; even if they did not produce as much as the plantations did, it could still have been enough to serve the demand of British producers. Similar arguments have been made by other historians.
Sexual economy of American slavery:
Scholar Adrienne Davis articulates how the economics of slavery also can be defined as a sexual economy, specifically focusing on how black women were expected to perform physical, sexual and reproductive labor to provide a consistent enslaved workforce and increase the profits of white slavers.
Davis writes that black women were needed for their "sexual and reproductive labor to satisfy the economic, political, and personal interest of white men of the elite class" articulating that black women's reproductive capacity was important in the maintenance of the system of slavery due to its ability to perpetuate an enslaved workforce.
She also draws attention to black women's labor being needed to maintain the aristocracy of a white ruling class, due to the intimate nature of reproduction and its potential for producing more enslaved people.
Due to the institution of partus sequitur ventrem, black women's wombs became the site where slavery was developed and transferred, meaning that black women were not only used for their physical labor, but for their sexual and reproductive labor as well.
"The rule that the children's status follows their mothers' was a foundational one for our economy. It converted enslaved women's reproductive capacity into market capital."
This articulation by Davis illustrates how black women's reproductive capacity was commodified under slavery, and that an analysis of the economic structures of slavery requires an acknowledgment of how pivotal black women's sexuality was in maintaining slavery's economic power.
Davis writes how black women performed labor under slavery, writing: "[black women were] male when convenient and horrifically female when needed". The fluctuating expectations of black women's gendered labor under slavery disrupted the white normative roles that were assigned to white men and white women.
This ungendering of black women under slavery contributed to the systemic dehumanization experienced by enslaved black women, as they were excluded from the expectations and experiences of either gender within the white binary.
Davis' arguments address the fact that, under slavery, black women's sexuality became linked to the economic and public sphere, making their intimate lives into public institutions. Black women's physical labor was gendered as masculine when it was needed to yield more profit, but their reproductive capacities and sexual labor were equally important in maintaining white power over black communities and perpetuating an enslaved workforce.
Geography and demography:
Slave importation:
About 600,000 slaves were transported to the United States, or five percent of the 12 million slaves taken from Africa. About 310,000 of these persons were imported into the Thirteen Colonies before 1776: 40 percent directly, and the rest from the Caribbean.
The great majority of enslaved Africans were transported to sugar plantations in the Caribbean and to Portuguese Brazil. As life expectancy was short, their numbers had to be continually replenished. Life expectancy was much higher in the United States, and the enslaved population was successful in reproduction, which was called "natural increase" by enslavers.
The population of enslaved people in the United States grew to 4 million by the 1860 census. Historian J. David Hacker conducted research which estimated that the cumulative number of slaves in colonial America and the United States (1619–1865) was 10 million.
Below Chart: Origins of American slaves
Below: Distribution of Slaves:
Below: Total Slave Population in U.S.: 1790-1860 by State and Territory:
For various reasons, the census did not always include all of the slaves, especially in the West. California was admitted as a free state and reported no slaves. However, many slaves were brought to work in the mines during the California Gold Rush.
Some Californian communities openly tolerated slavery, such as San Bernardino, which was mostly made up of transplants from the neighboring slave territory of Utah.
New Mexico Territory never reported any slaves on the census, yet sued the government for compensation for 600 slaves that were freed when Congress outlawed slavery in the territory.
Utah was actively trying to hide its slave population from Congress and did not report slaves in several communities.
Additionally, the census did not traditionally include Native Americans, and hence did not count Native American slaves or African slaves owned by Native Americans. There were hundreds of Native American slaves in California, Utah and New Mexico who were never recorded in the census.
Distribution of slaveholders:
As of the 1860 Census, one may compute the following statistics on slaveholding:
- Enumerating slave schedules by county, 393,975 named persons held 3,950,546 unnamed slaves, for an average of about ten slaves per holder. As some large holders held slaves in multiple counties and are thus multiply counted, this slightly overestimates the number of slaveholders.
- Excluding slaves, the 1860 U.S. population was 27,167,529; therefore, approximately 1.45% of free persons (roughly one in 69) were named slaveholders (393,975 named slaveholders among 27,167,529 free persons). By counting only named slaveholders, this approach does not acknowledge people who benefited from slavery by being in a slave-owning household, e.g., the wife and children of an owner; in 1850, there was an average of 5.55 people per household, so on average, around 8.05% of free persons lived in a slave-owning household. In the South, 33% of families owned at least one slave. According to historian Joseph Glatthaar, the number of soldiers of the Confederacy's Army of Northern Virginia who either owned slaves or came from slave-owning households was "almost one of every two 1861 recruits". He also notes that "Untold numbers of enlistees rented land from, sold crops to, or worked for slaveholders. In the final tabulation, the vast majority of the volunteers of 1861 had a direct connection to slavery."
- It is estimated by the transcriber Tom Blake that holders of 200 or more slaves, constituting less than 1% of all U.S. slaveholders (fewer than 4,000 persons, one in 7,000 free persons, or 0.015% of the population), held an estimated 20–30% of all slaves (800,000 to 1,200,000 slaves). Nineteen holders of 500 or more slaves have been identified. The largest slaveholder was Joshua John Ward of Georgetown, South Carolina, who in 1850 held 1,092 slaves and whose heirs in 1860 held 1,130 or 1,131 slaves; he was dubbed "the king of the rice planters", and one of his plantations is now part of Brookgreen Gardens.
- The percentage of families that owned slaves in 1860 in various groupings of states was as follows:
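The slaveholding statistics quoted above can be reproduced with a short computation. This is a sketch using only the figures given in the text (393,975 named slaveholders, 3,950,546 slaves, a free population of 27,167,529, and the 1850 average of 5.55 persons per household); the rounding and the household extrapolation follow the text's own method, not an independent estimate.

```python
# Sanity-check of the 1860 census slaveholding figures quoted above.
# All input numbers come from the text; the arithmetic simply
# reproduces the quoted averages and percentages.

named_slaveholders = 393_975      # named persons on the slave schedules
slaves_held = 3_950_546           # unnamed slaves enumerated
free_population = 27_167_529      # 1860 U.S. population excluding slaves
persons_per_household = 5.55      # average household size, per the 1850 census

avg_slaves_per_holder = slaves_held / named_slaveholders
share_named_holders = named_slaveholders / free_population
share_in_holding_household = share_named_holders * persons_per_household

print(f"average slaves per holder: {avg_slaves_per_holder:.1f}")      # about 10
print(f"free persons who were named holders: {share_named_holders:.2%}")  # about 1.45%
print(f"free persons in slave-owning households: {share_in_holding_household:.2%}")  # about 8.05%
```

Note that the household figure simply multiplies the holder share by the average household size, which is why the text hedges it as an "on average" approximation.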
Historiography:
Main article: Historiography of the United States § Slavery and Black history
The historian Peter Kolchin, writing in 1993, noted that until the latter decades of the 20th century, historians of slavery had primarily concerned themselves with the culture, practices and economics of the slaveholders, not with the slaves.
This was in part due to the circumstance that most slaveholders were literate and left behind written records, whereas slaves were largely illiterate and not in a position to leave written records. Scholars differed as to whether slavery should be considered a benign or a "harshly exploitive" institution.
Much of the history written prior to the 1950s had a distinctive racist slant to it. By the 1970s and 1980s, historians were using archaeological records, black folklore and statistical data to develop a much more detailed and nuanced picture of slave life. Individuals were shown to have been resilient and somewhat autonomous in many of their activities, within the limits of their situation and despite its precariousness.
Historians who wrote in this era include:
- John Blassingame (The Slave Community),
- Eugene Genovese (Roll, Jordan, Roll),
- Leslie Howard Owens (This Species of Property),
- and Herbert Gutman (The Black Family in Slavery and Freedom).
See also: Bibliography of slavery in the United States
See also
- White Supremacy, including the Ku Klux Klan (KKK)
- Jim Crow Laws
- Reconstruction Amendments
- How Slavery Became the Economic Engine of the South (History.com)
- Abolition of slavery timeline
- American Descendants of Slavery (ADOS)
- Glossary of American slavery
- Historiography of the United States § Slavery and Black history
- List of slave owners
- Lists of United States public officials who owned slaves
- Category:Slave owners killed in the American Civil War
- Reparations for slavery debate in the United States
- Slave insurance in the United States
- Slave narrative § North American slave narratives
- Slavery and Slaving in World History: A Bibliography
- Slavery at American colleges and universities
- Indian removal
- Triangular trade
- Slavery in the Spanish New World colonies
- Slavery in the British and French Caribbean
- Slavery in Cuba
- Slavery in Brazil
- Slavery in Latin America
- Slavery in America: A Resource Guide, Library of Congress
- "Born in Slavery: Slave Narratives from the Federal Writers' Project, 1936 to 1938", Library of Congress
- "Voices Remembering Slavery: Freed People Tell Their Stories", audio interviews of former slaves, 1932–1975, Library of Congress
- Digital Library on American Slavery at University of North Carolina at Greensboro
- "Slavery and the Making of America", WNET (4-part series)
- Slavery in America at the History Channel
- "Slavery in the United States", Economic History Encyclopedia, March 26, 2008
- North American Slave Narratives, Documenting the American South, Louis Round Wilson Library
- The Trans-Atlantic Slave Trade Database has information on almost 36,000 slaving voyages
- 1850: New Orleans woman and child she held in slavery
- American Capitalism Is Brutal. You Can Trace That to the Plantation. The New York Times Magazine. August 14, 2019.
Underground Railroad
- YouTube Video The Secret History of The Underground Railroad
- YouTube Video: The Underground Railroad: On the Road to Freedom
- YouTube Video: The breathtaking courage of Harriet Tubman - Janell Hobson
- UL: Entrance to the Underground Railroad.
- UR: In his novel The Underground Railroad, Colson Whitehead reconceives the Underground Railroad as a literal subterranean track with hidden stations and steam engines running along it.
- LL: Quaker abolitionist Levi Coffin and his wife Catherine helped more than 2,000 enslaved people escape to freedom
- LR: Harriet Tubman (photo H. B. Lindsley), c. 1870. A worker on the Underground Railroad, Tubman made 13 trips to the South, helping to free over 70 people. She led people to the Northern free states and Canada. This helped Harriet Tubman gain the name "Moses of Her People"
Underground Railroad
The Underground Railroad was a network of secret routes and safe houses established in the United States during the early to mid-19th century. It was used by enslaved African Americans primarily to escape into free states and from there to Canada.
The network, primarily the work of free African Americans, was assisted by abolitionists and others sympathetic to the cause of the escapees. The slaves who risked capture and those who aided them are also collectively referred to as the passengers and conductors of the "Underground Railroad".
Various other routes led to Mexico, where slavery had been abolished, and to islands in the Caribbean that were not part of the slave trade. An earlier escape route running south toward Florida, then a Spanish possession (except 1763–1783), existed from the late 17th century until approximately 1790.
However, the network now generally known as the Underground Railroad began in the late 18th century. It ran north and grew steadily until the Emancipation Proclamation was signed by President Abraham Lincoln. One estimate suggests that, by 1850, approximately 100,000 slaves had escaped to freedom via the network.
Origin of the name:
Eric Foner wrote that the term "was perhaps first used by a Washington newspaper in 1839, quoting a young slave hoping to escape bondage via a railroad that 'went underground all the way to Boston'". Dr. Robert Clemens Smedley wrote that following slave catchers' failed searches and lost traces of fugitives as far north as Columbia, Pennsylvania, they declared in bewilderment that "there must be an underground railroad somewhere," giving origin to the term.
Scott Shane wrote that the first documented use of the term was in an article written by Thomas Smallwood in the August 10, 1842, edition of Tocsin of Liberty, an abolitionist newspaper published in Albany. He also wrote that the 1879 book Sketches in the History of the Underground Railroad said the phrase was mentioned in an 1839 Washington newspaper article and that the book's author said 40 years later that he had quoted the article from memory as closely as he could.
Political background:
Many of the fugitive slaves who "rode" the Underground Railroad considered Canada their final destination. An estimated 30,000 to 40,000 of them settled in Canada, half of whom arrived between 1850 and 1860. Others settled in free states in the North.
Thousands of court cases for fugitive slaves were recorded between the Revolutionary War and the Civil War. Under the original Fugitive Slave Act of 1793, officials from free states were required to assist slaveholders or their agents who recaptured fugitives, but some state legislatures prohibited this. The law made it easier for slaveholders and slave catchers to capture African Americans and return them to slavery, and in some cases allowed them to enslave free blacks. It also created an eagerness among abolitionists to help enslaved people, resulting in the growth of anti-slavery societies and the Underground Railroad.
With heavy lobbying by Southern politicians, the Compromise of 1850 was passed by Congress after the Mexican–American War. It included a more stringent Fugitive Slave Law; ostensibly, the compromise addressed regional problems by compelling officials of free states to assist slave catchers, granting them immunity to operate in free states. Because the law required sparse documentation to claim a person was a fugitive, slave catchers also kidnapped free blacks, especially children, and sold them into slavery.
Southern politicians often exaggerated the number of escaped slaves and often blamed these escapes on Northerners interfering with Southern property rights. The law deprived people suspected of being slaves of the right to defend themselves in court, making it difficult to prove free status. Some Northern states enacted personal liberty laws that made it illegal for public officials to capture or imprison former slaves.
The perception that Northern states ignored the fugitive slave laws and regulations was a major justification offered for secession.
Routes:
Underground Railroad routes went north to free states and Canada, to the Caribbean, into United States western territories, and Indian territories. Some fugitive slaves traveled south into Mexico for their freedom. Many escaped by sea, including Ona Judge, who had been enslaved by President George Washington.
North to free states and Canada:
Structure:
Further information: Quakers in the abolition movement
Despite the thoroughfare's name, the escape network was neither literally underground nor a railroad. (The first literal underground railroad did not exist until 1863.) According to John Rankin, "It was so called because they who took passage on it disappeared from public view as really as if they had gone into the ground. After the fugitive slaves entered a depot on that road no trace of them could be found. They were secretly passed from one depot to another until they arrived at a destination where they were able to remain free."
It was known as a railroad, using rail terminology such as stations and conductors, because that was the transportation system in use at the time.
The Underground Railroad did not have a headquarters or governing body, nor were there published guides, maps, pamphlets, or even newspaper articles. The Underground Railroad consisted of meeting points, secret routes, transportation, and safe houses, all of them maintained by abolitionist sympathizers and communicated by word of mouth, although there is also a report of a numeric code used to encrypt messages.
Participants generally organized in small, independent groups; this helped to maintain secrecy. People escaping enslavement would move north along the route from one way station to the next. "Conductors" on the railroad came from various backgrounds and included free-born blacks, white abolitionists, the formerly enslaved (either escaped or manumitted), and Native Americans.
Believing that slavery was "contrary to the ethics of Jesus", Christian congregations and clergy played a role, especially the following:
The role of free blacks was crucial; without it, there would have been almost no chance for fugitives from slavery to reach freedom safely. The groups of underground railroad "agents" worked in organizations known as vigilance committees.
Routes:
The Underground Railroad benefited greatly from the geography of the U.S.–Canada border: Michigan, Ohio, Pennsylvania and most of New York were separated from Canada by water, over which transport was usually easy to arrange and relatively safe.
The main route for freedom seekers from the South led up the Appalachians (Harriet Tubman went via Harpers Ferry), through the highly anti-slavery Western Reserve region of northeastern Ohio to the vast shore of Lake Erie, and then to Canada by boat.
A smaller number, traveling by way of New York or New England, went:
The western route, used by John Brown among others, led from Missouri west to free Kansas and north to free Iowa, then east via Chicago to the Detroit River.
Terminology:
Members of the Underground Railroad often used specific terms, based on the metaphor of the railway. For example:
The Big Dipper (whose "bowl" points to the North Star) was known as the drinkin' gourd.
The Railroad was often known as the "freedom train" or "Gospel train", which headed towards "Heaven" or "the Promised Land", i.e., Canada.
William Still, sometimes called "The Father of the Underground Railroad", helped hundreds of slaves escape (as many as 60 a month), sometimes hiding them in his Philadelphia home. He kept careful records, including short biographies of the people, that contained frequent railway metaphors. He maintained correspondence with many of them, often acting as a middleman in communications between people who had escaped slavery and those left behind.
He later published these accounts in the book The Underground Railroad: Authentic Narratives and First-Hand Accounts (1872), a valuable resource for historians to understand how the system worked and learn about individual ingenuity in escapes.
According to Still, messages were often encoded so that they could be understood only by those active in the railroad. For example, the following message, "I have sent via at two o'clock four large hams and two small hams", indicated that four adults and two children were sent by train from Harrisburg to Philadelphia.
The additional word via indicated that the "passengers" were not sent on the usual train, but rather via Reading, Pennsylvania. In this case, the authorities were tricked into going to the regular location (station) in an attempt to intercept the runaways, while Still met them at the correct station and guided them to safety. They eventually escaped either further north or to Canada, where slavery had been abolished during the 1830s.
To reduce the risk of infiltration, many people associated with the Underground Railroad knew only their part of the operation and not of the whole scheme. "Conductors" led or transported the "passengers" from station to station. A conductor sometimes pretended to be enslaved to enter a plantation.
Once part of a plantation, the conductor would direct the runaways to the North. Enslaved people traveled at night, covering about 10–20 miles (16–32 km) between stations. During the day they rested at the so-called "stations" or "depots", often located in basements, barns, churches, or hiding places in caves, while a message was sent ahead to let the next station master know the escapees were on their way.
The resting spots where the freedom seekers could sleep and eat were given the code names "stations" and "depots", which were held by "station masters". "Stockholders" gave money or supplies for assistance. Using biblical references, fugitives referred to Canada as the "Promised Land" or "Heaven" and the Ohio River, which marked the boundary between slave states and free states, as the "River Jordan".
The majority of freedom seekers who escaped from slavery did so without help from an abolitionist. Although there are stories of black and white abolitionists helping freedom seekers, many escapes were unaided.
Maroon communities offered another kind of refuge along the Underground Railroad. These were hidden places, such as wetlands or marshes, where escaped slaves established their own independent communities. Examples in the United States include the Great Dismal Swamp in Virginia and Black Seminole communities in Florida, among others.
Traveling conditions:
Although the freedom seekers sometimes traveled by boat or train, they usually traveled on foot or by wagon, sometimes lying down, covered with hay or similar products, in groups of one to three escapees. Some groups were considerably larger. Abolitionist Charles Turner Torrey and his colleagues rented horses and wagons and often transported as many as 15 or 20 people at a time.
Free and enslaved black men employed as mariners (sailors) helped enslaved people escape by offering passage on their ships and by providing information on the safest escape routes, safe locations on land, and trusted people who could assist.
Enslaved African-American mariners also had news of slave revolts occurring in the Caribbean, and relayed it to enslaved people they met in American ports. Free and enslaved African-American mariners assisted Harriet Tubman in her rescue missions, providing her with information about the best escape routes.
Routes were often purposely indirect to confuse pursuers. Most escapes were by individuals or small groups; occasionally, there were mass escapes, such as with the Pearl incident. The journey was often considered particularly difficult and dangerous for women or children.
Children were sometimes hard to keep quiet or were unable to keep up with a group. In addition, enslaved women were rarely allowed to leave the plantation, making it harder for them to escape in the same ways that men could. Although escaping was harder for women, some women were successful.
One of the most famous and successful conductors (people who secretly traveled into slave states to rescue those seeking freedom) was Harriet Tubman, a woman who escaped slavery.
Due to the risk of discovery, information about routes and safe havens was passed along by word of mouth, although an 1896 account refers to a numerical code used to encrypt messages. Southern newspapers of the day were often filled with pages of notices soliciting information about fugitive slaves and offering sizable rewards for their capture and return. Federal marshals and professional bounty hunters known as slave catchers pursued freedom seekers as far as the Canada–U.S. border.
"Reverse Underground Railroad":
Freedom seekers were not the only black people at risk from slave catchers. With demand for slaves high in the Deep South as cotton cultivation expanded, strong, healthy blacks in their prime working and reproductive years were seen and treated as highly valuable commodities.
Both former slaves and free blacks were sometimes kidnapped and sold into slavery, as in the well-documented case of Solomon Northup, a New York-born free black who was kidnapped by Southern slavers while visiting Washington, DC. "Certificates of Freedom" (signed, notarized statements attesting to the free status of individual Blacks, also known as free papers) could easily be destroyed or stolen, so they provided little protection.
This trade in kidnapped free blacks became known as the "Reverse Underground Railroad". Some buildings, such as the Crenshaw House in far-southeastern Illinois, are known sites where free blacks were sold into slavery.
Fugitive Slave Act of 1850:
Under the terms of the Fugitive Slave Act of 1850, when suspected fugitives were seized and brought to a special magistrate known as a commissioner, they had no right to a jury trial and could not testify on their own behalf. Technically, they were not accused of a crime. The marshal or private slave-catcher needed only to swear an oath to acquire a writ of replevin for the return of property.
Congress was dominated by Southern congressmen because the population of their states was bolstered by the inclusion of three-fifths of the number of slaves in population totals. They passed the Fugitive Slave Law of 1850 because of frustration at having fugitives from slavery helped by the public and even official institutions outside the South. In some parts of the North, slave-catchers needed police protection.
Arrival in Canada:
See also: American immigration to Canada and Slavery in Canada
British North America (present-day Canada) was a desirable destination, as its long border gave many points of access, it was farther from slave catchers, and beyond the reach of the United States' Fugitive Slave Acts. Further, slavery ended decades earlier in Canada than in the United States.
Britain banned the institution of slavery in present-day Canada (and in most British colonies) in 1833, though the practice had effectively ended there early in the 19th century through case law, as courts ruled in favor of slaves seeking manumission.
Most formerly enslaved people who reached Canada by boat across Lake Erie and Lake Ontario settled in Ontario. More than 30,000 people were said to have escaped there via the network during its 20-year peak period, although U.S. census figures account for only 6,000.
Numerous fugitives' stories are documented in the 1872 book The Underground Railroad Records by William Still, an abolitionist who then headed the Philadelphia Vigilance Committee.
Estimates vary widely, but at least 30,000 slaves, and potentially more than 100,000, escaped to Canada via the Underground Railroad. The largest group settled in Upper Canada (Ontario), called Canada West from 1841. Numerous Black Canadian communities developed in Southern Ontario.
These were generally in the triangular region bounded by Niagara Falls, Toronto, and Windsor. Several rural villages made up mostly of people freed from slavery were established in Kent and Essex counties in Ontario.
Fort Malden, in Amherstburg, Ontario, was deemed the "chief place of entry" for escaped slaves seeking to enter Canada. The abolitionist Levi Coffin, who was known for aiding over 2,000 fugitives to safety, supported this choice. He described Fort Malden as "the great landing place, the principle terminus of the underground railroad of the west."
After 1850, approximately thirty people a day were crossing over to Fort Malden by steamboat. The Sultana was one of the ships, making "frequent round trips" between Great Lakes ports. Its captain, C.W. Appleby, a celebrated mariner, facilitated the conveyance of several fugitives from various Lake Erie ports to Fort Malden.
Other fugitives at Fort Malden had been assisted by William Wells Brown, himself someone who had escaped slavery. He found employment on a Lake Erie steamer and transported numerous fugitives from Cleveland to Ontario by way of Buffalo or Detroit. "It is well known", he tells us, "that a great number of fugitives make their escape to Canada, by way of Cleaveland. ...The friends of the slave, knowing that I would transport them without charge, never failed to have a delegation when the boat arrived at Cleaveland. I have sometimes had four or five on board at one time."
Another important destination was Nova Scotia, which was first settled by Black Loyalists during the American Revolution and then by Black Refugees during the War of 1812 (see Black Nova Scotians).
Important Black settlements also developed in other parts of British North America (now parts of Canada). These included Lower Canada (present-day Quebec) and Vancouver Island, where Governor James Douglas encouraged Black immigration because of his opposition to slavery. He also hoped a significant Black community would form a bulwark against those who wished to unite the island with the United States.
Upon arriving at their destinations, many freedom seekers were disappointed, as life in Canada was difficult. While they were no longer at risk from slave catchers, racial discrimination was still widespread.
Many of the new arrivals had to compete with mass European immigration for jobs, and overt racism was common. For example, in reaction to Black Loyalists being settled in eastern Canada by the Crown, the city of Saint John, New Brunswick, amended its charter in 1785 specifically to exclude Blacks from practicing a trade, selling goods, fishing in the harbor, or becoming freemen; these provisions stood until 1870.
With the outbreak of the Civil War in the U.S., many black refugees left Canada to enlist in the Union Army. While some later returned to Canada, many remained in the United States.
Thousands of others returned to the American South after the war ended. The desire to reconnect with friends and family was strong, and most were hopeful about the changes emancipation and Reconstruction would bring.
Folklore:
Since the 1980s, claims have arisen that quilt designs were used to signal and direct enslaved people to escape routes and assistance. According to advocates of the quilt theory, ten quilt patterns were used to direct enslaved people to take particular actions. The quilts were placed one at a time on a fence as a means of nonverbal communication to alert escaping slaves. The code had a dual meaning: first to signal enslaved people to prepare to escape, and second to give clues and indicate directions on the journey.
The quilt design theory is disputed. The first published work documenting an oral history source was in 1999, and the first publication of this theory is believed to be a 1980 children's book. Quilt historians and scholars of pre-Civil War (1820–1860) America have disputed this legend. There is no contemporary evidence of any sort of quilt code, and quilt historians such as Pat Cummings and Barbara Brackman have raised serious questions about the idea. In addition, Underground Railroad historian Giles Wright has published a pamphlet debunking the quilt code.
Similarly, some popular, nonacademic sources claim that spirituals and other songs, such as "Steal Away" or "Follow the Drinking Gourd", contained coded information and helped individuals navigate the railroad. They have offered little evidence to support their claims. Scholars tend to believe that while the slave songs may certainly have expressed hope for deliverance from the sorrows of this world, these songs did not present literal help for runaway slaves.
The Underground Railroad inspired cultural works. For example, "Song of the Free", written in 1860 about a man fleeing slavery in Tennessee by escaping to Canada, was composed to the tune of "Oh! Susanna". Every stanza ends with a reference to Canada as the land "where colored men are free".
Slavery in Upper Canada (now Ontario) was outlawed in 1793; in 1819, John Robinson, the Attorney General of Upper Canada, declared that by residing in Canada, black residents were set free, and that Canadian courts would protect their freedom. Slavery in Canada as a whole had been in rapid decline after an 1803 court ruling, and was finally abolished outright in 1834.
Legal and political:
When frictions between North and South culminated in the Civil War, many Black people, both enslaved and free, fought for the Union Army. Following Union victory in the Civil War, on December 6, 1865, the Thirteenth Amendment to the Constitution outlawed slavery except as punishment for a crime.
Following its passage, in some cases the Underground Railroad operated in the opposite direction, as people who had escaped to Canada returned to the United States.
Criticism:
Frederick Douglass, a writer and statesman who had escaped slavery, wrote critically of the attention drawn to the ostensibly secret Underground Railroad in his first autobiography, Narrative of the Life of Frederick Douglass, an American Slave (1845):
"I have never approved of the very public manner in which some of our western friends have conducted what they call the Underground Railroad, but which I think, by their open declarations, has been made most emphatically the upperground railroad."
He went on to say that, although he honored the movement, he felt that the efforts at publicity served more to enlighten the slave-owners than the slaves, making the owners more watchful and making it more difficult for future slaves to escape.
Notable People:
See also: Category:Underground Railroad people
South to Florida and Mexico:
Background:
Beginning in the 16th century, Spaniards brought enslaved Africans to New Spain, including Mission Nombre de Dios in what would become the city of St. Augustine in Spanish Florida.
Over time, free Afro-Spaniards took up various trades and occupations and served in the colonial militia. After King Charles II of Spain proclaimed Spanish Florida a safe haven for escaped slaves from British North America, they began escaping to Florida by the hundreds from as far north as New York. The Spanish established Fort Mose for the free Blacks in the St. Augustine area in 1738.
In 1806, enslaved people seeking freedom arrived at the Stone Fort in Nacogdoches, Texas, carrying a forged passport from a Kentucky judge. The Spanish refused to return them to the United States. More freedom seekers traveled through Texas the following year.
Enslaved people were emancipated by crossing the border from the United States into Mexico, which remained a Spanish colony into the nineteenth century. In the United States, enslaved people were considered property: they had no right to marry, could be sold away from their partners, and had no legal protection against inhumane and cruel punishment.
In New Spain, fugitive slaves were recognized as humans. They were allowed to join the Catholic Church and marry. They also were protected from inhumane and cruel punishment.
During the War of 1812, U.S. Army general Andrew Jackson invaded Spanish Florida in part because enslaved people had run away from plantations in the Carolinas and Georgia to Florida. Some of the runaways joined the Black Seminoles who later moved to Mexico.
Mexico sent mixed signals, though, on its position against slavery. It sometimes allowed enslaved people to be returned to slavery, and it allowed Americans to settle its northern territory, where they established cotton plantations and brought enslaved people to work the land.
In 1829, Mexican president Vicente Guerrero (who was a mixed race black man) formally abolished slavery in Mexico. Freedom seekers from Southern plantations in the Deep South, particularly from Louisiana, Mississippi and Texas, escaped slavery and headed for Mexico.
At that time, Texas was part of Mexico. The Texas Revolution, initiated in part to legalize slavery, resulted in the formation of the Republic of Texas in 1836. Following the Battle of San Jacinto, there were some enslaved people who withdrew from the Houston area with the Mexican army, seeing the troops as a means to escape slavery.
When Texas joined the Union in 1845, it was a slave state and the Rio Grande became the international border with Mexico.
Tensions between free and slave states deepened as Mexico abolished slavery in 1837 and western states joined the Union as free states. As more free states were added, the influence of slave-state representatives in Congress diminished.
Slave states and slave hunters:
The Southern Underground Railroad ran through slave states, which lacked the abolitionist societies and the organized system of the North. People who spoke out against slavery risked mob violence, physical assault, and hanging.
Slave catchers searched for runaways, and there were never more than a few hundred free blacks in Texas, so free blacks did not feel safe in the state. The network to freedom was informal, random, and dangerous.
U.S. military forts, established along the Rio Grande border during the Mexican–American War of the 1840s, captured and returned fleeing enslaved people to their slaveholders.
The Fugitive Slave Act of 1850 made it a criminal act to aid escaping enslaved people in free states. The United States government also sought a treaty with Mexico under which Mexico would help capture and return bondspeople. Mexico, however, continued its practice of allowing anyone who crossed its border to be free. Slave catchers continued to cross the southern border into Mexico and illegally capture black people to return them to slavery. A group of slave hunters became the Texas Rangers.
Routes:
Thousands of freedom seekers traveled along a network from the southern United States to Texas and ultimately Mexico. Southern enslaved people generally traveled across "unforgiving country" on foot or horseback while pursued by lawmen and slave hunters.
Some stowed away on ferries bound for Mexican ports from New Orleans, Louisiana, and Galveston, Texas. Others hauled cotton by wagon to Brownsville, Texas, and then crossed into Mexico at Matamoros.
"Sometimes someone would come 'long and try to get us to run up north and be free. We used to laugh at that" —Former slave Felix Haywood, interviewed in 1937 for the federal Slave Narrative Project.
Many traveled through North Carolina, Arkansas, Alabama, Louisiana, or Mississippi toward Texas and ultimately Mexico. People fled slavery from Indian Territory (now Oklahoma).
Black Seminoles traveled on a southwestern route from Florida into Mexico. Going overland meant that the last 150 miles or so were traversed through the difficult and extremely hot terrain of the Nueces Strip located between the Nueces River and the Rio Grande. There was little shade and a lack of potable water in this brush country.
Escapees were more likely to survive the trip if they had a horse and a gun.
In 2010, the National Park Service identified a route from Natchitoches, Louisiana, to Monclova, Mexico, that roughly follows the southern Underground Railroad path. The El Camino Real de los Tejas, designated a National Historic Trail by President George W. Bush in 2004, is also believed to have been a path to freedom.
Assistance:
Some journeyed on their own without assistance, and others were helped by people along the southern Underground Railroad. Assistance included guidance, directions, shelter, and supplies.
Black people, black and white couples, and anti-slavery German immigrants provided support, but most of the help came from Mexican laborers. So extensive was this help that enslavers came to distrust any Mexican, and a law was enacted in Texas that forbade Mexicans from talking to enslaved people.
Mexican migrant workers developed relationships with the enslaved black workers alongside whom they labored, offering empathy and guidance, such as what it would be like to cross the border. Having realized the ways in which Mexicans were helping enslaved people escape, slaveholders and residents of Texan towns drove Mexicans out of town, whipped them in public, or lynched them.
Some border officials helped enslaved people crossing into Mexico. In Monclova, Mexico, a border official took up a collection in the town for a family in need of food, clothing, and money to continue their journey south, out of reach of slave hunters.
Once they crossed the border, some Mexican authorities protected formerly enslaved people from being returned to the United States by slave hunters.
Freedom seekers who were taken by ferry to Mexican ports were aided by Mexican ship captains, one of whom was caught in Louisiana and indicted for helping enslaved people escape.
Knowing the repercussions of running away, or of being caught helping someone run away, people were careful to cover their tracks, and public and personal records about fugitive slaves are scarce. Records by people who promoted slavery or attempted to catch fugitive slaves survive in greater supply. More than 2,500 escapes are documented by the Texas Runaway Slave Project at Stephen F. Austin State University.
Southern freedom seekers:
Slaveholders placed advertisements in newspapers offering rewards for the return of their "property", and slave catchers traveled through Mexico. Black Seminoles, or Los Mascogos, who lived in northern Mexico, provided armed resistance.
Sam Houston, president of the Republic of Texas, was the slaveholder of a man named Tom, who ran away and enlisted in the Mexican military.
One enslaved man was branded with the letter "R" on each cheek after a failed attempt to escape slavery. He tried again in the winter of 1819, leaving his enslaver's cotton plantation on horseback. With four others, he traveled southwest toward Mexico at the risk of being attacked by hostile Native Americans, apprehended by slave catchers, or attacked by "horse-eating alligators".
Many people did not make it to Mexico. In 1842, a Mexican man and a black woman left Jackson County, Texas, on two horses, but they were caught at the Lavaca River. The wife, an enslaved woman, was valuable to her owner, so she was returned to slavery. Her husband, possibly a farm laborer or an indentured servant, was immediately lynched.
Fugitive slaves changed their names in Mexico, married into Mexican families, and relocated further south of the border. All of these factors make it hard to trace the whereabouts of formerly enslaved people.
The Texas Runaway Slave Project at Stephen F. Austin State University maintains a database of runaway slave advertisements. During the Great Depression, the Works Progress Administration's Federal Writers' Project documented slave narratives, including those of people who settled in Mexico. One of them was Felix Haywood, who found freedom when he crossed the Rio Grande.
Rio Grande stations:
Two families, the Webbers and the Jacksons, lived along the Rio Grande and helped people escape slavery. In both families the husband was white and the wife a formerly enslaved black woman.
Nathaniel Jackson, a white southerner, and Matilda Hicks, an enslaved woman, had been childhood sweethearts in Alabama; Jackson was the son of Hicks's slaveholder. It is not known whether Jackson purchased the freedom of Hicks and her family, but in the early 1860s they moved to Hidalgo County, where they settled and lived as a family. In 1857 Jackson helped a group of seven families, and later others, cross into Mexico.
Silvia Hector Webber was born enslaved in West Florida and in 1819 was sold to a slaveholder in Clark County, Arkansas. The slaveholder's son, John Cryer, illegally brought Silvia to Mexican Texas in 1828, four years after Mexico had outlawed the slave trade into its territory. With the help of John Webber, however, Silvia secured freedom papers for herself and her three children in 1834.
Together Silvia and John lived an antislavery life and often harbored fugitives from slavery at their ranch and house. Silvia was known to transport freedom seekers, on a ferry she licensed at her ranch, on to freedom in Mexico.
John Ferdinand Webber, born in Vermont, lived along the Rio Grande with his wife, Silvia Hector Webber, and the two were known to have helped enslaved people cross the river. The Jacksons and the Webbers, who both operated licensed ferry services, were well known among runaways.
Arrival in Mexico:
Fugitive slaves who made it to Mexico lived with the knowledge that they could be illegally kidnapped by slave catchers or blackbirders. Slave hunters who tried to kidnap former slaves from Mexico could be taken to court or shot. There was little support from their new communities and few opportunities for employment. They did not have official paperwork that stated that they were free. They were, though, able to enter into indentured servitude contracts and join military colonies.
Some people, after they settled in Mexico, returned to the United States to help family members escape and to guide them to Mexico.
Colonies:
Abolitionists from the North petitioned the Mexican government to establish colonies for free and runaway blacks. Benjamin Lundy, a Quaker, lobbied in the early 1830s for a colony in what is now Texas, but the effort failed when Texas separated from Mexico, became the Republic of Texas (1836), and legalized slavery.
Black Seminoles successfully petitioned for land and established a colony in 1852. The land is still owned by their descendants.
Scholarship:
The Texas Runaway Slave Project, located in Nacogdoches at the Stephen F. Austin State University, has researched runaway advertisements that appeared in 19,000 editions of newspapers from the mid-19th century.
Alice L. Baumgartner has studied the prevalence of people who fled slavery from the Southern states to Mexico. She published South to Freedom: Runaway Slaves to Mexico and the Road to the Civil War.
Thomas Mareite completed a doctoral dissertation at Leiden University on the social and political experiences of enslaved people who escaped from the U.S. South to Mexico, titled Conditional Freedom: Free Soil and Fugitive Slaves from the U.S. South to Mexico's Northeast, 1803-1861.
Roseann Bacha-Garza, of the University of Texas Rio Grande Valley, has managed historical archeology projects and has researched the incidence of enslaved people who fled to Mexico.
Mekala Audain has also published a chapter titled "A Scheme to Desert: The Louisiana Purchase and Freedom Seekers in the Louisiana-Texas Borderlands, 1804-1806" in the edited volume In Search of Liberty: African American Internationalism in the Nineteenth-Century Atlantic World.
Maria Esther Hammack completed her doctoral dissertation on the subject in 2021 at the University of Texas at Austin.
National Underground Railroad Network:
Following upon legislation passed in 1990 for the National Park Service to perform a special resource study of the Underground Railroad, in 1997, the 105th Congress introduced and subsequently passed H.R. 1635 – National Underground Railroad Network to Freedom Act of 1998, which President Bill Clinton signed into law in 1998.
This act authorized the United States National Park Service to establish the National Underground Railroad Network to Freedom program to identify associated sites, as well as preserve them and popularize the Underground Railroad and stories of people involved in it.
The National Park Service has designated many sites within the network, posted stories about people and places, sponsors an essay contest, and holds a national conference about the Underground Railroad in May or June each year.
The Harriet Tubman Underground Railroad National Historical Park, which includes Underground Railroad routes in three counties of Maryland's Eastern Shore and Harriet Tubman's birthplace, was created by President Barack Obama under the Antiquities Act on March 25, 2013.
Its sister park, the Harriet Tubman National Historical Park in Auburn, New York, was established on January 10, 2017, and focuses on the later years of Tubman's life as well as her involvement with the Underground Railroad and the abolition movement.
In popular culture:
Inspirations for fiction:
Literature:
Music:
Underground Railroad was a company created by Tupac Shakur, Big D the Impossible, Shock G, Pee Wee, Jeremy, Raw Fusion, and Live Squad with the purpose of promoting and helping young black women and men record music, allowing them to initiate and develop their musical careers.
The Underground Railroad was a network of secret routes and safe houses established in the United States during the early to mid-19th century. It was used by enslaved African Americans primarily to escape into free states and from there to Canada.
The network, primarily the work of free African Americans, was assisted by abolitionists and others sympathetic to the cause of the escapees. The slaves who risked capture and those who aided them are also collectively referred to as the passengers and conductors of the "Underground Railroad".
Various other routes led to Mexico, where slavery had been abolished, and to islands in the Caribbean that were not part of the slave trade. An earlier escape route running south toward Florida, then a Spanish possession (except 1763–1783), existed from the late 17th century until approximately 1790.
However, the network now generally known as the Underground Railroad began in the late 18th century. It ran north and grew steadily until the Emancipation Proclamation was signed by President Abraham Lincoln. One estimate suggests that, by 1850, approximately 100,000 slaves had escaped to freedom via the network.
Origin of the name:
Eric Foner wrote that the term "was perhaps first used by a Washington newspaper in 1839, quoting a young slave hoping to escape bondage via a railroad that 'went underground all the way to Boston'". Robert Clemens Smedley wrote that after slave catchers repeatedly lost the trails of fugitives as far north as Columbia, Pennsylvania, they declared in bewilderment that "there must be an underground railroad somewhere", giving origin to the term.
Scott Shane wrote that the first documented use of the term was in an article written by Thomas Smallwood in the August 10, 1842, edition of Tocsin of Liberty, an abolitionist newspaper published in Albany. He also wrote that the 1879 book Sketches in the History of the Underground Railroad said the phrase was mentioned in an 1839 Washington newspaper article and that the book's author said 40 years later that he had quoted the article from memory as closely as he could.
Political background:
Many of the fugitive slaves who "rode" the Underground Railroad considered Canada their final destination. An estimated 30,000 to 40,000 of them settled there, half of whom came between 1850 and 1860. Others settled in free states in the North.
Thousands of court cases for fugitive slaves were recorded between the Revolutionary War and the Civil War. Under the original Fugitive Slave Act of 1793, officials from free states were required to assist slaveholders or their agents who recaptured fugitives, but some state legislatures prohibited this. The law made it easier for slaveholders and slave catchers to capture African Americans and return them to slavery, and in some cases allowed them to enslave free blacks. It also created an eagerness among abolitionists to help enslaved people, resulting in the growth of anti-slavery societies and the Underground Railroad.
With heavy lobbying by Southern politicians, the Compromise of 1850 was passed by Congress after the Mexican–American War. It included a more stringent Fugitive Slave Law; ostensibly, the compromise addressed regional problems by compelling officials of free states to assist slave catchers, granting them immunity to operate in free states. Because the law required sparse documentation to claim a person was a fugitive, slave catchers also kidnapped free blacks, especially children, and sold them into slavery.
Southern politicians often exaggerated the number of escaped slaves and often blamed these escapes on Northerners interfering with Southern property rights. The law deprived people suspected of being slaves of the right to defend themselves in court, making it difficult to prove free status. Some Northern states enacted personal liberty laws that made it illegal for public officials to capture or imprison former slaves.
The perception that Northern states ignored the fugitive slave laws and regulations was a major justification offered for secession.
Routes:
Underground Railroad routes went north to free states and Canada, to the Caribbean, into United States western territories, and Indian territories. Some fugitive slaves traveled south into Mexico for their freedom. Many escaped by sea, including Ona Judge, who had been enslaved by President George Washington.
North to free states and Canada:
Structure:
Further information: Quakers in the abolition movement
Despite the thoroughfare's name, the escape network was neither literally underground nor a railroad. (The first literal underground railroad did not exist until 1863.) According to John Rankin, "It was so called because they who took passage on it disappeared from public view as really as if they had gone into the ground. After the fugitive slaves entered a depot on that road no trace of them could be found. They were secretly passed from one depot to another until they arrived at a destination where they were able to remain free."
It was known as a railroad, using rail terminology such as stations and conductors, because that was the transportation system in use at the time.
The Underground Railroad did not have a headquarters or governing body, nor were there published guides, maps, pamphlets, or even newspaper articles. The Underground Railroad consisted of meeting points, secret routes, transportation, and safe houses, all of them maintained by abolitionist sympathizers and communicated by word of mouth, although there is also a report of a numeric code used to encrypt messages.
Participants generally organized in small, independent groups; this helped to maintain secrecy. People escaping enslavement would move north along the route from one way station to the next. "Conductors" on the railroad came from various backgrounds and included free-born blacks, white abolitionists, the formerly enslaved (either escaped or manumitted), and Native Americans.
Believing that slavery was "contrary to the ethics of Jesus", Christian congregations and clergy played a role, especially the following:
- Religious Society of Friends (Quakers),
- Congregationalists,
- Wesleyan Methodists,
- and Reformed Presbyterians,
- as well as the anti-slavery branches of mainstream denominations which entered into schism over the issue, such as the Methodist Episcopal Church and the Baptists.
The role of free blacks was crucial; without it, there would have been almost no chance for fugitives from slavery to reach freedom safely. The groups of underground railroad "agents" worked in organizations known as vigilance committees.
Routes:
The Underground Railroad benefited greatly from the geography of the U.S.–Canada border: Michigan, Ohio, Pennsylvania and most of New York were separated from Canada by water, over which transport was usually easy to arrange and relatively safe.
The main route for freedom seekers from the South led up the Appalachians (Harriet Tubman traveled via Harpers Ferry), through the strongly anti-slavery Western Reserve region of northeastern Ohio to the vast shore of Lake Erie, and then to Canada by boat.
A smaller number, traveling by way of New York or New England, went:
- via Syracuse (home of Samuel May)
- and Rochester, New York (home of Frederick Douglass),
- crossing the Niagara River or Lake Ontario into Canada.
- Those traveling via the New York Adirondacks,
- sometimes via Black communities like Timbuctoo, New York,
- entered Canada via
- Ogdensburg, on the St. Lawrence River,
- or on Lake Champlain (Joshua Young assisted).
The western route, used by John Brown among others, led from Missouri west to free Kansas and north to free Iowa, then east via Chicago to the Detroit River.
Terminology:
Members of the Underground Railroad often used specific terms, based on the metaphor of the railway. For example:
- People who helped fugitive slaves find the railroad were "agents"
- Guides were known as "conductors"
- Hiding places were "stations" or "way stations"
- "Station masters" hid escaping slaves in their homes
- People escaping slavery were referred to as "passengers" or "cargo"
- Fugitive slaves would obtain a "ticket"
- Similar to common gospel lore, the "wheels would keep on turning"
- Financial benefactors of the Railroad were known as "stockholders"
The Big Dipper (whose "bowl" points to the North Star) was known as the drinkin' gourd.
The Railroad was often known as the "freedom train" or "Gospel train", which headed towards "Heaven" or "the Promised Land", i.e., Canada.
William Still, sometimes called "The Father of the Underground Railroad", helped hundreds of slaves escape (as many as 60 a month), sometimes hiding them in his Philadelphia home. He kept careful records, including short biographies of the people, that contained frequent railway metaphors. He maintained correspondence with many of them, often acting as a middleman in communications between people who had escaped slavery and those left behind.
He later published these accounts in the book The Underground Railroad: Authentic Narratives and First-Hand Accounts (1872), a valuable resource for historians to understand how the system worked and learn about individual ingenuity in escapes.
According to Still, messages were often encoded so that they could be understood only by those active in the railroad. For example, the following message, "I have sent via at two o'clock four large hams and two small hams", indicated that four adults and two children were sent by train from Harrisburg to Philadelphia.
The additional word via indicated that the "passengers" were not sent on the usual train, but rather via Reading, Pennsylvania. In this case, the authorities were tricked into going to the regular location (station) in an attempt to intercept the runaways, while Still met them at the correct station and guided them to safety. They eventually escaped either further north or to Canada, where slavery had been abolished during the 1830s.
To reduce the risk of infiltration, many people associated with the Underground Railroad knew only their part of the operation and not of the whole scheme. "Conductors" led or transported the "passengers" from station to station. A conductor sometimes pretended to be enslaved to enter a plantation.
Once part of a plantation, the conductor would direct the runaways to the North. Enslaved people traveled at night, covering about 10–20 miles (16–32 km) between stations. During the day they rested at the so-called "stations" or "depots", while a message was sent ahead to let the next station master know the escapees were on their way. The stations were often located in basements, barns, churches, or in hiding places in caves.
The resting spots where the freedom seekers could sleep and eat were given the code names "stations" and "depots", which were run by "station masters". "Stockholders" gave money or supplies for assistance. Using biblical references, fugitives referred to Canada as the "Promised Land" or "Heaven" and the Ohio River, which marked the boundary between slave states and free states, as the "River Jordan".
The majority of freedom seekers who escaped from slavery did not have help from an abolitionist. Although there are stories of black and white abolitionists helping freedom seekers escape from slavery, many escapes were unaided.
Other Underground Railroad escape routes for freedom seekers were maroon communities. Maroon communities were hidden places, such as wetlands or marshes, where escaped slaves established their own independent communities. Examples of maroon communities in the United States include the Great Dismal Swamp in Virginia and Black Seminole communities in Florida, among others.
Traveling conditions:
Although the freedom seekers sometimes traveled by boat or train, they usually traveled on foot or by wagon, sometimes lying down, covered with hay or similar products, in groups of one to three escapees. Some groups were considerably larger. Abolitionist Charles Turner Torrey and his colleagues rented horses and wagons and often transported as many as 15 or 20 people at a time.
Free and enslaved black men employed as mariners (sailors) helped enslaved people escape by giving them passage on their ships and by sharing information on the safest escape routes, safe locations on land, and trusted people who could assist.
Enslaved African-American mariners also learned of slave revolts occurring in the Caribbean and relayed this news to enslaved people they had contact with in American ports. Free and enslaved African-American mariners assisted Harriet Tubman on her rescue missions, providing her with information about the best escape routes.
Routes were often purposely indirect to confuse pursuers. Most escapes were by individuals or small groups; occasionally, there were mass escapes, such as with the Pearl incident. The journey was often considered particularly difficult and dangerous for women or children.
Children were sometimes hard to keep quiet or were unable to keep up with a group. In addition, enslaved women were rarely allowed to leave the plantation, making it harder for them to escape in the same ways that men could. Although escaping was harder for women, some women were successful.
One of the most famous and successful conductors (people who secretly traveled into slave states to rescue those seeking freedom) was Harriet Tubman, a woman who escaped slavery.
Due to the risk of discovery, information about routes and safe havens was passed along by word of mouth, although an 1896 account refers to a numerical code used to encrypt messages. Southern newspapers of the day were often filled with pages of notices soliciting information about fugitive slaves and offering sizable rewards for their capture and return. Federal marshals and professional bounty hunters known as slave catchers pursued freedom seekers as far as the Canada–U.S. border.
"Reverse Underground Railroad":
Freedom seekers were not the only black people at risk from slave catchers. With demand for slaves high in the Deep South as cotton was planted, strong, healthy blacks in their prime working and reproductive years were seen and treated as highly valuable commodities.
Both former slaves and free blacks were sometimes kidnapped and sold into slavery, as in the well-documented case of Solomon Northup, a New York-born free black who was kidnapped by Southern slavers while visiting Washington, DC. "Certificates of Freedom" (signed, notarized statements attesting to the free status of individual Blacks, also known as free papers) could easily be destroyed or stolen, so they provided little protection.
Some buildings, such as the Crenshaw House in far-southeastern Illinois, are known sites where free blacks were sold into slavery via this "Reverse Underground Railroad".
Fugitive Slave Act of 1850:
Under the terms of the Fugitive Slave Act of 1850, when suspected fugitives were seized and brought to a special magistrate known as a commissioner, they had no right to a jury trial and could not testify on their own behalf. Technically, they were not accused of a crime. The marshal or private slave-catcher needed only to swear an oath to acquire a writ of replevin for the return of property.
Congress was dominated by Southern congressmen because the population of their states was bolstered by the inclusion of three-fifths of the number of slaves in population totals. They passed the Fugitive Slave Law of 1850 because of frustration at having fugitives from slavery helped by the public and even official institutions outside the South. In some parts of the North, slave-catchers needed police protection.
Arrival in Canada:
See also: American immigration to Canada and Slavery in Canada
British North America (present-day Canada) was a desirable destination, as its long border gave many points of access, it was farther from slave catchers, and beyond the reach of the United States' Fugitive Slave Acts. Further, slavery ended decades earlier in Canada than in the United States.
Britain banned the institution of slavery in present-day Canada (and in most British colonies) in 1833, though the practice of slavery in Canada had effectively ended early in the 19th century through case law, as courts ruled in favor of slaves seeking manumission.
Most formerly enslaved people, reaching Canada by boat across Lake Erie and Lake Ontario, settled in Ontario. More than 30,000 people were said to have escaped there via the network during its 20-year peak period, although U.S. census figures account for only 6,000.
Numerous fugitives' stories are documented in the 1872 book The Underground Railroad Records by William Still, an abolitionist who then headed the Philadelphia Vigilance Committee.
Estimates vary widely, but at least 30,000 slaves, and potentially more than 100,000, escaped to Canada via the Underground Railroad. The largest group settled in Upper Canada (Ontario), called Canada West from 1841. Numerous Black Canadian communities developed in Southern Ontario.
These were generally in the triangular region bounded by Niagara Falls, Toronto, and Windsor. Several rural villages made up mostly of people freed from slavery were established in Kent and Essex counties in Ontario.
Fort Malden, in Amherstburg, Ontario, was deemed the "chief place of entry" for escaped slaves seeking to enter Canada. The abolitionist Levi Coffin, who was known for aiding over 2,000 fugitives to safety, supported this choice. He described Fort Malden as "the great landing place, the principle terminus of the underground railroad of the west."
After 1850, approximately thirty people a day were crossing over to Fort Malden by steamboat. The Sultana was one of the ships, making "frequent round trips" between Great Lakes ports. Its captain, C.W. Appleby, a celebrated mariner, facilitated the conveyance of several fugitives from various Lake Erie ports to Fort Malden.
Other fugitives at Fort Malden had been assisted by William Wells Brown, himself someone who had escaped slavery. He found employment on a Lake Erie steamer and transported numerous fugitives from Cleveland to Ontario by way of Buffalo or Detroit. "It is well known", he tells us, "that a great number of fugitives make their escape to Canada, by way of Cleaveland. ...The friends of the slave, knowing that I would transport them without charge, never failed to have a delegation when the boat arrived at Cleaveland. I have sometimes had four or five on board at one time."
Another important destination was Nova Scotia, which was first settled by Black Loyalists during the American Revolution and then by Black Refugees during the War of 1812 (see Black Nova Scotians).
Important Black settlements also developed in other parts of British North America (now parts of Canada). These included Lower Canada (present-day Quebec) and Vancouver Island, where Governor James Douglas encouraged Black immigration because of his opposition to slavery. He also hoped a significant Black community would form a bulwark against those who wished to unite the island with the United States.
Upon arriving at their destinations, many freedom seekers were disappointed, as life in Canada was difficult. While they were no longer at risk from slave catchers, racial discrimination was still widespread.
Many of the new arrivals had to compete with mass European immigration for jobs, and overt racism was common. For example, in reaction to Black Loyalists being settled in eastern Canada by the Crown, the city of Saint John, New Brunswick, amended its charter in 1785 specifically to exclude Blacks from practicing a trade, selling goods, fishing in the harbor, or becoming freemen; these provisions stood until 1870.
With the outbreak of the Civil War in the U.S., many black refugees left Canada to enlist in the Union Army. While some later returned to Canada, many remained in the United States.
Thousands of others returned to the American South after the war ended. The desire to reconnect with friends and family was strong, and most were hopeful about the changes emancipation and Reconstruction would bring.
Folklore:
Since the 1980s, claims have arisen that quilt designs were used to signal and direct enslaved people to escape routes and assistance. According to advocates of the quilt theory, ten quilt patterns were used to direct enslaved people to take particular actions. The quilts were placed one at a time on a fence as a means of nonverbal communication to alert escaping slaves. The code had a dual meaning: first to signal enslaved people to prepare to escape, and second to give clues and indicate directions on the journey.
The quilt design theory is disputed. The first published work documenting an oral history source was in 1999, and the first publication of this theory is believed to be a 1980 children's book. Quilt historians and scholars of pre-Civil War (1820–1860) America have disputed this legend. There is no contemporary evidence of any sort of quilt code, and quilt historians such as Pat Cummings and Barbara Brackman have raised serious questions about the idea. In addition, Underground Railroad historian Giles Wright has published a pamphlet debunking the quilt code.
Similarly, some popular, nonacademic sources claim that spirituals and other songs, such as "Steal Away" or "Follow the Drinking Gourd", contained coded information and helped individuals navigate the railroad. They have offered little evidence to support their claims. Scholars tend to believe that while the slave songs may certainly have expressed hope for deliverance from the sorrows of this world, these songs did not present literal help for runaway slaves.
The Underground Railroad inspired cultural works. For example, "Song of the Free", written in 1860 about a man fleeing slavery in Tennessee by escaping to Canada, was composed to the tune of "Oh! Susanna". Every stanza ends with a reference to Canada as the land "where colored men are free".
Slavery in Upper Canada (now Ontario) was outlawed in 1793; in 1819, John Robinson, the Attorney General of Upper Canada, declared that by residing in Canada, black residents were set free, and that Canadian courts would protect their freedom. Slavery in Canada as a whole had been in rapid decline after an 1803 court ruling, and was finally abolished outright in 1834.
Legal and political:
When frictions between North and South culminated in the Civil War, many Black people, both enslaved and free, fought for the Union Army. Following Union victory in the Civil War, on December 6, 1865, the Thirteenth Amendment to the Constitution outlawed slavery except as punishment for a crime.
Following its passage, in some cases the Underground Railroad operated in the opposite direction, as people who had escaped to Canada returned to the United States.
Criticism:
Frederick Douglass was a writer and statesman who had escaped slavery. He wrote critically of the attention drawn to the ostensibly secret Underground Railroad in his first autobiography, Narrative of the Life of Frederick Douglass, an American Slave (1845):
"I have never approved of the very public manner in which some of our western friends have conducted what they call the Underground Railroad, but which I think, by their open declarations, has been made most emphatically the upperground railroad."
He went on to say that, although he honors the movement, he felt that the efforts at publicity serve more to enlighten the slave-owners than the slaves, making them more watchful and making it more difficult for future slaves to escape.
Notable people:
See also: Category:Underground Railroad people
- Ann Bamford
- John Brown
- Owen Brown (father)
- Owen Brown (son)
- Samuel Burris
- Obadiah Bush
- Levi Coffin
- Elizabeth Rous Comstock
- George Corson
- Moses Dickson
- Frederick Douglass
- Asa Drury
- George Hussey Earle Sr.
- Calvin Fairbank
- Bartholomew Fussell
- Matilda Joslyn Gage
- Thomas Galt
- Thomas Garrett
- Sydney Howard Gay
- Josiah Bushnell Grinnell
- Frances Harper
- Laura Smith Haviland
- Lewis Hayden
- John Hunn
- Roger Hooker Leavitt
- Jermain Wesley Loguen
- Samuel Joseph May
- John Berry Meachum
- Mary Meachum
- William M. Mitchell
- Solomon Northup
- John Parker
- Elijah F. Pennypacker
- Mary Ellen Pleasant
- John Wesley Posey
- Amy and Isaac Post
- John Rankin
- Alexander Milton Ross
- David Ruggles
- Gerrit Smith
- George Luther Stearns
- William Still
- John Ton
- Charles Turner Torrey
- William Troy
- Harriet Tubman
- Martha Coffin Wright
- John Van Zandt
- Bernardhus Van Leer
- Silvia and John Webber
South to Florida and Mexico:
Background:
Beginning in the 16th century, Spaniards brought enslaved Africans to New Spain, including Mission Nombre de Dios in what would become the city of St. Augustine in Spanish Florida.
Over time, free Afro-Spaniards took up various trades and occupations and served in the colonial militia. After King Charles II of Spain proclaimed Spanish Florida a safe haven for escaped slaves from British North America, they began escaping to Florida by the hundreds from as far north as New York. The Spanish established Fort Mose for the free Blacks in the St. Augustine area in 1738.
In 1806, enslaved people arrived at the Stone Fort in Nacogdoches, Texas seeking freedom. They arrived with a forged passport from a Kentucky judge. The Spanish refused to return them to the United States. More freedom seekers traveled through Texas the following year.
Enslaved people were emancipated by crossing the border from the United States into Mexico, which was a Spanish colony into the nineteenth century. In the United States, enslaved people were considered property: they had no right to marry, they could be sold away from their partners, and they had no legal protection against inhumane and cruel punishment.
In New Spain, fugitive slaves were recognized as humans. They were allowed to join the Catholic Church and marry. They also were protected from inhumane and cruel punishment.
During the War of 1812, U.S. Army general Andrew Jackson invaded Spanish Florida in part because enslaved people had run away from plantations in the Carolinas and Georgia to Florida. Some of the runaways joined the Black Seminoles who later moved to Mexico.
Mexico, however, sent mixed signals on its position against slavery. At times it allowed enslaved people to be returned to slavery, and it allowed Americans to move into Spanish territory to populate the north, where the Americans established cotton plantations and brought enslaved people to work the land.
In 1829, Mexican president Vicente Guerrero (a man of mixed race and African descent) formally abolished slavery in Mexico. Freedom seekers from Southern plantations in the Deep South, particularly from Louisiana, Mississippi and Texas, escaped slavery and headed for Mexico.
At that time, Texas was part of Mexico. The Texas Revolution, initiated in part to legalize slavery, resulted in the formation of the Republic of Texas in 1836. Following the Battle of San Jacinto, there were some enslaved people who withdrew from the Houston area with the Mexican army, seeing the troops as a means to escape slavery.
When Texas joined the Union in 1845, it was a slave state and the Rio Grande became the international border with Mexico.
Tensions between free and slave states deepened as Mexico abolished slavery in 1837 and western states joined the Union as free states. As more free states were added to the Union, slave-state representatives had less influence in Congress.
Slave states and slave hunters:
The Southern Underground Railroad went through slave states, lacking the abolitionist societies and the organized system of the North. People who spoke out against slavery were subject to mob violence, physical assault, and hanging.
There were slave catchers who looked for runaway slaves. There were never more than a few hundred free blacks in Texas, which meant that free blacks did not feel safe in the state. The network to freedom was informal, random, and dangerous.
U.S. military forts, established along the Rio Grande border during the Mexican–American War of the 1840s, captured and returned fleeing enslaved people to their slaveholders.
The Fugitive Slave Act of 1850 made it a criminal act to aid escaping enslaved people in free states. Similarly, the United States government sought a treaty with Mexico under which Mexico would help capture and return bondspeople. Mexico, however, continued its practice of allowing anyone who crossed its border to be free. Slave catchers continued to cross the southern border into Mexico and illegally capture black people and return them to slavery. A group of slave hunters became the Texas Rangers.
Routes:
Thousands of freedom seekers traveled along a network from the southern United States to Texas and ultimately Mexico. Southern enslaved people generally traveled across "unforgiving country" on foot or horseback while pursued by lawmen and slave hunters.
Some stowed away on ferries bound for a Mexican port from New Orleans, Louisiana and Galveston, Texas. There were some who transported cotton to Brownsville, Texas on wagons and then crossed into Mexico at Matamoros.
"Sometimes someone would come 'long and try to get us to run up north and be free. We used to laugh at that" —Former slave Felix Haywood, interviewed in 1937 for the federal Slave Narrative Project.
Many traveled through North Carolina, Arkansas, Alabama, Louisiana, or Mississippi toward Texas and ultimately Mexico. People fled slavery from Indian Territory (now Oklahoma).
Black Seminoles traveled on a southwestern route from Florida into Mexico. Going overland meant that the last 150 miles or so were traversed through the difficult and extremely hot terrain of the Nueces Strip located between the Nueces River and the Rio Grande. There was little shade and a lack of potable water in this brush country.
Escapees were more likely to survive the trip if they had a horse and a gun.
In 2010, the National Park Service identified a route from Natchitoches, Louisiana to Monclova, Mexico that roughly follows the southern Underground Railroad path. El Camino Real de los Tejas is also believed to have been a path to freedom; it was made a National Historic Trail by President George W. Bush in 2004.
Assistance:
Some journeyed on their own without assistance, and others were helped by people along the southern Underground Railroad. Assistance included guidance, directions, shelter, and supplies.
Black people, black and white couples, and anti-slavery German immigrants provided support, but most of the help came from Mexican laborers, so much so that enslavers came to distrust any Mexican, and a law was enacted in Texas that forbade Mexicans from talking to enslaved people.
Mexican migrant workers developed relationships with the enslaved black workers alongside whom they labored, offering empathy and guidance, such as what it would be like to cross the border. Once slaveholders and residents of Texan towns realized the ways in which Mexicans were helping enslaved people escape, they pushed Mexicans out of town, whipped them in public, or lynched them.
Some border officials helped enslaved people crossing into Mexico. In Monclova, Mexico, a border official took up a collection in the town for a family in need of food, clothing, and money to continue on their journey south, out of reach of slave hunters.
Once they crossed the border, some Mexican authorities protected formerly enslaved people from being returned to the United States by slave hunters.
Freedom seekers who traveled by ferry to Mexican ports were aided by Mexican ship captains, one of whom was caught in Louisiana and indicted for helping enslaved people escape.
Knowing the repercussions of running away, or of being caught helping someone run away, people were careful to cover their tracks, so public and personal records about fugitive slaves are scarce. In greater supply are records by people who promoted slavery or attempted to catch fugitive slaves. More than 2,500 escapes are documented by the Texas Runaway Slave Project at Stephen F. Austin State University.
Southern freedom seekers:
Slaveholders placed advertisements in newspapers offering rewards for the return of their "property", and slave catchers traveled through Mexico. Black Seminoles, or Los Mascogos, who lived in northern Mexico, provided armed resistance.
Sam Houston, president of the Republic of Texas, was the slaveholder of a man named Tom, who ran away and enlisted in the Mexican military.
One enslaved man was branded with the letter "R" on each side of his cheek after a failed attempt to escape slavery. He tried again in the winter of 1819, leaving the cotton plantation of his enslaver on horseback. With four others, he traveled southwest to Mexico at the risk of being attacked by hostile Native Americans, apprehended by slave catchers, or attacked by "horse-eating alligators".
Many people did not make it to Mexico. In 1842, a Mexican man and a black woman left Jackson County, Texas on two horses, but they were caught at the Lavaca River. The wife, an enslaved woman, was valuable to her owner, so she was returned to slavery. Her husband, possibly a farm laborer or an indentured servant, was immediately lynched.
Fugitive slaves changed their names in Mexico, married into Mexican families, and relocated further south of the American–Mexican border. All of these factors make it hard to trace the whereabouts of the formerly enslaved people.
Stephen F. Austin State University maintains a database of runaway slave advertisements as part of The Texas Runaway Slave Project. During the Great Depression, the Works Progress Administration's Federal Writers' Project documented slave narratives, including those of people who settled in Mexico. One of them was Felix Haywood, who found freedom when he crossed the Rio Grande.
Rio Grande stations:
Two families, the Webbers and the Jacksons, lived along the Rio Grande and helped people escape slavery. In each family, the husband was white and the wife was a formerly enslaved Black woman. Nathaniel Jackson, a white southerner, and Matilda Hicks, an enslaved woman, had been childhood sweethearts in Alabama; Jackson was the son of Hicks's slaveholder. It is not known whether Jackson purchased the freedom of Hicks and her family, but in the early 1860s they moved to Hidalgo County, where they settled and lived as a family. In 1857, Jackson helped a group of seven families, and others, cross into Mexico.
Silvia Hector Webber was born enslaved in West Florida and in 1819 was sold to a slaveholder in Clark County, Arkansas. The slaveholder's son, John Cryer, illegally brought Silvia to Mexican Texas in 1828, four years after Mexico had outlawed the slave trade into Mexican territory. With the help of John Webber, however, Silvia secured freedom papers for herself and her three children in 1834.
Together, Silvia and John lived an antislavery life and often harbored fugitives from slavery at their ranch and house. Silvia was known to transport freedom seekers to freedom in Mexico on a ferry she licensed at her ranch.
John Ferdinand Webber, born in Vermont, lived along the Rio Grande with his wife, Silvia Hector Webber, and together they were known to have helped enslaved people cross the Rio Grande. The Jacksons and the Webbers, each of whom operated a licensed ferry service, were well known among runaways.
Arrival in Mexico:
Fugitive slaves who made it to Mexico lived with the knowledge that they could be illegally kidnapped by slave catchers or blackbirders, although slave hunters who tried to kidnap former slaves from Mexico could be taken to court or shot. Freedom seekers often found little support from their new communities and few opportunities for employment, and they lacked official paperwork stating that they were free. They were, though, able to enter into indentured servitude contracts and to join military colonies.
Some people, after they settled in Mexico, returned to the United States to help family members escape and to guide them to Mexico.
Colonies:
Abolitionists from the North petitioned the Mexican government to establish colonies for free and runaway Black people. Benjamin Lundy, a Quaker, lobbied during the early 1830s for a colony to be established in what is now Texas, but the plan failed when Texas separated from Mexico, legalized slavery, and became the Republic of Texas (1836).
Black Seminoles successfully petitioned for land and established a colony in 1852. The land is still owned by their descendants.
See also:
Scholarship:
The Texas Runaway Slave Project, located in Nacogdoches at the Stephen F. Austin State University, has researched runaway advertisements that appeared in 19,000 editions of newspapers from the mid-19th century.
Alice L. Baumgartner has studied the prevalence of people who fled slavery from the Southern states to Mexico. She published South to Freedom: Runaway Slaves to Mexico and the Road to the Civil War.
Thomas Mareite completed a doctoral dissertation at Leiden University on the social and political experiences of enslaved people who escaped from the U.S. South to Mexico, titled Conditional Freedom: Free Soil and Fugitive Slaves from the U.S. South to Mexico's Northeast, 1803-1861.
Roseann Bacha-Garza, of the University of Texas Rio Grande Valley, has managed historical archeology projects and has researched enslaved people who fled to Mexico.
Mekala Audain has also published a chapter titled "A Scheme to Desert: The Louisiana Purchase and Freedom Seekers in the Louisiana-Texas Borderlands, 1804-1806" in the edited volume In Search of Liberty: African American Internationalism in the Nineteenth-Century Atlantic World.
Maria Esther Hammack completed her doctoral dissertation on the subject in 2021 at the University of Texas at Austin.
National Underground Railroad Network:
Following legislation passed in 1990 directing the National Park Service to perform a special resource study of the Underground Railroad, in 1997 the 105th Congress introduced and subsequently passed H.R. 1635, the National Underground Railroad Network to Freedom Act of 1998, which President Bill Clinton signed into law in 1998.
This act authorized the United States National Park Service to establish the National Underground Railroad Network to Freedom program to identify associated sites, as well as preserve them and popularize the Underground Railroad and stories of people involved in it.
The National Park Service has designated many sites within the network, posts stories about people and places, sponsors an essay contest, and holds a national conference about the Underground Railroad in May or June each year.
The Harriet Tubman Underground Railroad National Historical Park, which includes Underground Railroad routes in three counties of Maryland's Eastern Shore and Harriet Tubman's birthplace, was created by President Barack Obama under the Antiquities Act on March 25, 2013.
Its sister park, the Harriet Tubman National Historical Park in Auburn, New York, was established on January 10, 2017, and focuses on the later years of Tubman's life as well as her involvement with the Underground Railroad and the abolition movement.
In popular culture:
Inspirations for fiction:
- The Underground Railroad is a 2016 novel by Colson Whitehead. It won the 2016 National Book Award and the 2017 Pulitzer Prize for Fiction.
- The Underground Railroad is a 2021 streaming television limited series, based on Whitehead's novel.
- Underground is an American television series that premiered in 2016, on WGN America.
Literature:
- David Walker (1829) Appeal to the Coloured Citizens of the World
- Harriet Beecher Stowe (1852) Uncle Tom's Cabin
- Caroline Lee Hentz (1854) The Planter's Northern Bride
- William M. Mitchell (1860) The Under-Ground Railroad
- Sarah Hopkins Bradford (1869) Scenes in the Life of Harriet Tubman; (1896) Harriet Tubman, Moses of Her People
Music:
Underground Railroad was a company created by Tupac Shakur, Big D the Impossible, Shock G, Pee Wee, Jeremy, Raw Fusion, and Live Squad for the purpose of promoting and helping young Black women and men record music, allowing them to begin and develop their musical careers.
See also:
- Angola, Florida
- Ausable Chasm, NY, home of the North Star Underground Railroad Museum
- Bilger's Rocks
- Caroline Quarlls (1824–1892), first known person to escape slavery through Wisconsin's Underground Railroad
- Escape to Sweden, an "underground railroad" during the Holocaust in Norway
- Fort Mose Historic State Park
- List of Underground Railroad sites
- Reverse Underground Railroad
- Slave codes
- Tilly Escape
- Timbuctoo, New York
- Uncle Tom's Cabin Historic Site near Dresden, Ontario
- Underground Railroad: Language of Slavery
- Underground Railroad Studies
- Underground Railroad Timeline
- Friends of the Underground Railroad
- National Underground Railroad Freedom Center
- Underground Railroad in Buffalo and Upstate New York: A bibliography by The Buffalo History Museum
- Newspaper articles and clippings about the Underground Railroad at Newspapers.com
Civil Rights Movement, including Jim Crow Laws
- YouTube Video: History of Racial Segregation in the United States
- YouTube Video: Racial Segregation and Concentrated Poverty: The History of Housing in Black America
- YouTube Video: Jim Crow Laws and Racial Segregation in America | The Civil Rights Movement
- TOP: The Politics Of Passing 1964's Civil Rights Act: Demonstrators march down Constitution Avenue during the March on Washington on Aug. 28, 1963. Hulton Archive/ Getty Images
- BOTTOM: The Signing of THE CIVIL RIGHTS ACT OF 1964: by President Lyndon Johnson is one of the most far-reaching acts of legislation supporting racial equality in American history:
The civil rights movement was a social movement and campaign from 1954 to 1968 in the United States to abolish legalized racial segregation, discrimination, and disenfranchisement in the country.
The movement had its origins in the Reconstruction era during the late 19th century and had its modern roots in the 1940s, although the movement made its largest legislative gains in the 1960s after years of direct actions and grassroots protests.
The social movement's major nonviolent resistance and civil disobedience campaigns eventually secured new protections in federal law for the civil rights of all Americans.
After the American Civil War and the subsequent abolition of slavery in the 1860s, the Reconstruction Amendments to the United States Constitution granted emancipation and constitutional rights of citizenship to all African Americans, most of whom had recently been enslaved.
For a short period of time, African-American men voted and held political office, but as time went on Blacks were increasingly deprived of civil rights, often under the racist Jim Crow laws (see next topic below), and African Americans were subjected to discrimination and sustained violence by white supremacists in the South.
Over the following century, various efforts were made by African Americans to secure their legal and civil rights, such as the civil rights movement (1865–1896) and the civil rights movement (1896–1954).
The movement was characterized by nonviolent mass protests and civil disobedience following highly publicized events such as the lynching of Emmett Till. These included boycotts such as the Montgomery bus boycott, "sit-ins" in Greensboro and Nashville, a series of protests during the Birmingham campaign, and a march from Selma to Montgomery.
At the culmination of a legal strategy pursued by African Americans, in 1954 the Supreme Court struck down as unconstitutional the legal underpinnings of racial segregation and discrimination in the United States.
The Warren Court made a series of landmark rulings against racist discrimination, including:
- Brown v. Board of Education (1954), which struck down the separate but equal doctrine and banned segregation in public schools and public accommodations;
- Loving v. Virginia (1967), which struck down all state laws banning interracial marriage.
The rulings played a crucial role in bringing an end to the segregationist Jim Crow laws prevalent in the Southern states. In the 1960s, moderates in the movement worked with the United States Congress to achieve the passage of several significant pieces of federal legislation that authorized oversight and enforcement of civil rights laws.
The Civil Rights Act of 1964 explicitly banned all discrimination based on race, including:
- racial segregation in schools,
- businesses,
- and in public accommodations.
The Voting Rights Act of 1965 restored and protected voting rights by authorizing federal oversight of registration and elections in areas with historic under-representation of minority voters.
The Fair Housing Act of 1968 banned discrimination in the sale or rental of housing.
African Americans re-entered politics in the South, and young people across the country began to take action.
From 1964 through 1970, a wave of riots and protests in black communities dampened support from the white middle class but increased support from private foundations.
The emergence of the Black Power movement, which lasted from 1965 to 1975, challenged Black leaders of the movement for their cooperative attitude and their adherence to legalism and nonviolence.
Its leaders demanded not only legal equality, but also economic self-sufficiency for the community. Support for the Black Power movement came from African Americans who had seen little material improvement since the civil rights movement's peak in the mid-1960s, and still faced discrimination in jobs, housing, education and politics.
Many popular representations of the civil rights movement are centered on the charismatic leadership and philosophy of Martin Luther King Jr., who won the 1964 Nobel Peace Prize for combatting racial inequality through nonviolent resistance. However, some scholars note that the movement was too diverse to be credited to any particular person, organization, or strategy.
Click on any of the following blue hyperlinks for more about the Civil Rights Movement:
- Background
- History
- Brown v. Board of Education, 1954
- Emmett Till's murder, 1955
- Rosa Parks and the Montgomery bus boycott, 1955–1956
- Little Rock Nine, 1957
- Method of nonviolence and nonviolence training
- Sit-ins, 1958–1960
- Freedom Rides, 1961
- Voter registration organizing
- Integration of Mississippi universities, 1956–1965
- Albany Movement, 1961–1962
- Birmingham campaign, 1963
- March on Washington, 1963
- Malcolm X joins the movement, 1964–1965
- St. Augustine, Florida, 1963–1964
- Chester school protests, Spring 1964
- Freedom Summer, 1964
- Civil Rights Act of 1964
- Harlem riot of 1964
- Mississippi Freedom Democratic Party, 1964
- Selma Voting Rights Movement
- Voting Rights Act of 1965
- Watts riot of 1965
- Fair housing movements, 1966–1968
- Nationwide riots of 1967
- Memphis, King assassination, and Civil Rights Act of 1968
- Gates v. Collier
- Legacy
- Characteristics
- Political responses
- Popular reactions
- In popular culture
- Activist organizations
- Individual activists
- See also:
- Civil rights movement (1896–1954)
- Civil rights movement (1865–1896)
- American Indian Movement
- Asian American movement
- Chicano Movement
- History of civil rights in the United States
- List of civil rights leaders
- List of Kentucky women in the civil rights era
- List of photographers of the civil rights movement
- South Carolina in the civil rights movement
- Timeline of the civil rights movement
- "We Shall Overcome," the unofficial anthem of the movement
- History preservation
- Birmingham Civil Rights National Monument
- Civil Rights Movement Archive
- Freedom Riders National Monument
- Read's Drug Store (Baltimore), the site of a 1955 desegregation sit-in
- Seattle Civil Rights and Labor History Project
- Television News of the Civil Rights Era 1950–1970
- Post–civil rights movement
- Black Lives Matter
- Post–civil rights era in African-American history
- The Modern Civil Rights Movement, 1954–1964
- Information from The National Park Service
- Civil Rights in America
- Voices from the Southern Civil Rights Movement Exhibit – Provided by the American Archive of Public Broadcasting
- Civil Rights Digital Library – Provided by the Digital Library of Georgia.
- Civil Rights Movement Archive – provides movement history, personal stories, documents, and photos (hosted by Tougaloo College)
- Civil Rights Movement Timeline – Provided by History.com on December 4, 2017, and updated on January 19, 2021. Archived from the original on January 19, 2021
- Television News of the Civil Rights Era 1950–1970 – Provided by the University of Virginia.
- Provided by the Library of Congress:
- Civil Rights in America: A Resource Guide
- The Civil Rights Era – Part of The African American Odyssey: A Quest for Full Citizenship presentation.
- Voices of Civil Rights – A project with the collaboration of AARP and the Leadership Conference on Civil Rights (LCCR).
- We Shall Overcome: Historic Places of the Civil Rights Movement – Provided by the National Park Service.
- Provided by Southern Poverty Law Center:
- "Teaching the Movement: The State Standards We Deserve" – Part of "Teaching Tolerance" project published on September 19, 2011.
- "Teaching Tolerance Publishes Guide for Teaching the Civil Rights Movement" – Part of "Teaching Tolerance" project published on March 26, 2014.
- "Teaching the Movement 2014: The State of Civil Rights Education in the United States" – Part of "Teaching Tolerance" project published in 2014.
- Civil Rights Teaching – Provided by Teaching for Change, a 501(c)(3) organization.
- SNCC Digital Gateway – Profiles and primary documents on the Student Nonviolent Coordinating Committee (SNCC), the national civil rights movement organization led by young people. A project of the SNCC Legacy Project, Duke's Center for Documentary Studies, and Duke University Libraries.
- Collection: "U.S. Civil Rights Movement" from the University of Michigan Museum of Art
Jim Crow Laws:
The Jim Crow laws were state and local laws introduced in the Southern United States in the late 19th and early 20th centuries that enforced racial segregation, "Jim Crow" being a pejorative term for an African American.
Such laws remained in force until 1965. Formal and informal segregation policies were present in other areas of the United States as well, even though several states outside the South had banned discrimination in public accommodations and voting.
Southern laws were enacted by white-dominated state legislatures (see "Redeemers") to disenfranchise African Americans and roll back the political and economic gains they had made during the Reconstruction era. Such continuing racial segregation was also supported by the successful Lily-white movement.
In practice, Jim Crow laws mandated racial segregation in all public facilities in the states of the former Confederate States of America and in some others, beginning in the 1870s.
Jim Crow laws were upheld in 1896 in the case of Plessy v. Ferguson, in which the Supreme Court laid out its "separate but equal" legal doctrine concerning facilities for African Americans.
Moreover, public education had essentially been segregated since its establishment in most of the South after the Civil War (1861–1865). Companion laws excluded almost all African Americans from the vote in the South and deprived them of any representative government.
Although in theory, the "equal" segregation doctrine governed public facilities and transportation too, facilities for African Americans were consistently inferior and underfunded compared to facilities for white Americans; sometimes, there were no facilities for the black community at all.
Far from equality, as a body of law, Jim Crow institutionalized economic, educational, political and social disadvantages and second class citizenship for most African Americans living in the United States.
After the National Association for the Advancement of Colored People (NAACP) was founded in 1909, it became involved in sustained public protests and campaigns against the Jim Crow laws and the so-called "separate but equal" doctrine.
In 1954, segregation of public schools (state-sponsored) was declared unconstitutional by the U.S. Supreme Court in the landmark case Brown v. Board of Education of Topeka.
In some states, it took many years to implement this decision, while the Warren Court continued to rule against Jim Crow legislation in other cases such as Heart of Atlanta Motel, Inc. v. United States (1964). In general, the remaining Jim Crow laws were overturned by the Civil Rights Act of 1964 and the Voting Rights Act of 1965.
Etymology:
The earliest known use of the phrase "Jim Crow law" can be dated to 1884 in a newspaper article summarizing congressional debate. The term appears in 1892 in the title of a New York Times article about Louisiana requiring segregated railroad cars.
The origin of the phrase "Jim Crow" has often been attributed to "Jump Jim Crow", a song-and-dance caricature of black people performed by white actor Thomas D. Rice in blackface, first performed in 1828.
As a result of Rice's fame, Jim Crow had become by 1838 a pejorative expression meaning "Negro". When southern legislatures passed laws of racial segregation directed against African Americans at the end of the 19th century, these statutes became known as Jim Crow laws.
Click on any of the following blue hyperlinks for more about Jim Crow Laws:
- Origins
- Historical development
- Decline and removal
- Influence and aftermath
- Remembrance
- See also:
- Anti-miscegenation laws
- Apartheid
- Black Codes in the United States
- Disenfranchisement after the Reconstruction era
- Group Areas Act
- Jim Crow economy
- List of Jim Crow law examples by state
- Lynching
- Mass racial violence in the United States
- Penal labor
- Racial segregation in the United States
- Racism in the United States
- Second-class citizen
- Sundown town
- Timeline of the civil rights movement
- The New Jim Crow
- The Jim Crow Museum of Racist Memorabilia
- Jim Crow and Reconstruction
- The History of Jim Crow, Ronald L. F. Davis – A series of essays on the history of Jim Crow. Archive index at the Wayback Machine
- Creating Jim Crow – Origins of the term and system of laws.
- Racial Etiquette: The Racial Customs and Rules of Racial Behavior in Jim Crow America – The basics of Jim Crow etiquette.
- "You Don't Have to Ride Jim Crow!" PBS documentary on first Freedom Ride, in 1947.
- List of laws enacted in various states
- Jim Crow Era, History in the Key of Jazz, Gerald Early, Washington University in St. Louis, Missouri (esp. see section "Jim Crow is Born")
- "Jim Crow Laws". National Park Service. Retrieved November 17, 2010. Examples of Jim Crow laws
- Reports of the Death of Jim Crow Prove Greatly Exaggerated. Bill Morris, The Daily Beast.
- Jim Crow Signs at A History of Central Florida Podcast
- Black Justice - American Civil Liberties Union, 1931
Harlem Renaissance
TOP: Three African American women in Harlem during the Harlem Renaissance, ca. 1925 (By Public Domain - http://www.blackpast.org/perspectives/passing-passing-peculiarly-american-racial-tradition-approaches-irrelevance, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=61998651)
BOTTOM: Black Then: The Black & Beautiful Club: Fabulous Fashion of "The Harlem Renaissance" (More details)
- YouTube Video: The Harlem Renaissance: A Period of Radical Change
- YouTube Video: The Great Migration & The Harlem Renaissance
- YouTube Video: Why African-Americans left the south in droves — and what's bringing them back
The Harlem Renaissance was an intellectual and cultural revival of African American music, dance, art, fashion, literature, theater, politics and scholarship centered in Harlem, Manhattan, New York City, spanning the 1920s and 1930s.
At the time, it was known as the "New Negro Movement", named after The New Negro, a 1925 anthology edited by Alain Locke.
The movement also included new African American cultural expressions across urban areas in the Northeast and Midwest United States, shaped by a renewed militancy in the general struggle for civil rights and by the Great Migration of African American workers fleeing the racist conditions of the Jim Crow Deep South; Harlem was the final destination of the largest number of those who migrated north.
Though it was centered in the Harlem neighborhood, many francophone black writers from African and Caribbean colonies who lived in Paris were also influenced by the movement, which spanned from about 1918 until the mid-1930s. Many of its ideas lived on much longer.
The zenith of this "flowering of Negro literature", as James Weldon Johnson preferred to call the Harlem Renaissance, took place between 1924, when Opportunity: A Journal of Negro Life hosted a party for black writers attended by many white publishers, and 1929, the year of the stock-market crash and the beginning of the Great Depression.
The Harlem Renaissance is considered to have been a rebirth of the African American arts.
Background:
Until the end of the Civil War, the majority of African Americans had been enslaved and lived in the South. During the Reconstruction Era, the emancipated African Americans began to strive for civic participation, political equality, and economic and cultural self-determination.
Soon after the end of the Civil War, the Ku Klux Klan Act of 1871 gave rise to speeches by African American congressmen addressing the bill. By 1875, sixteen African Americans had been elected to and served in Congress, giving numerous speeches with their newfound civil empowerment.
The Ku Klux Klan Act of 1871 was followed by the passage of the Civil Rights Act of 1875, part of Reconstruction legislation by Republicans. During the mid-to-late 1870s, racist whites organized in the Democratic Party launched a murderous campaign of racist terrorism to regain political power throughout the South.
From 1890 to 1908, they proceeded to pass legislation that disenfranchised most African Americans and many poor whites, trapping them without representation. They established white supremacist regimes of Jim Crow segregation in the South and one-party bloc voting behind Southern Democrats.
Democratic Party politicians (many having been former slaveowners and political and military leaders of the Confederacy) conspired to deny African Americans their exercise of civil and political rights by terrorizing black communities with lynch mobs and other forms of vigilante violence as well as by instituting a convict labor system that forced many thousands of African Americans back into unpaid labor in mines, plantations and on public works projects such as roads and levees.
Convict laborers were typically subject to brutal forms of corporal punishment, overwork and disease from unsanitary conditions. Death rates were extraordinarily high. While a small number of African Americans were able to acquire land shortly after the Civil War, most were exploited as sharecroppers.
Whether sharecropping or on their own acreage, most of the black population was closely financially dependent on agriculture. This added another impetus for the Migration: The arrival of the boll weevil. The beetle eventually came to waste 8% of the country's cotton yield annually and thus disproportionately impacted this part of America's citizenry.
As life in the South became increasingly difficult, African Americans began to migrate north in great numbers.
Most of the future leading lights of what was to become known as the "Harlem Renaissance" movement arose from a generation that had memories of the gains and losses of Reconstruction after the Civil War. Sometimes their parents, grandparents – or they themselves – had been slaves. Their ancestors had sometimes benefited by paternal investment in cultural capital, including better-than-average education.
Many in the Harlem Renaissance were part of the early 20th century Great Migration out of the South into the African American neighborhoods of the Northeast and Midwest. African Americans sought a better standard of living and relief from the institutionalized racism in the South.
Others were people of African descent from racially stratified communities in the Caribbean who came to the United States hoping for a better life. Uniting most of them was their convergence in Harlem.
Development:
During the early portion of the 20th century, Harlem was the destination for migrants from around the country, attracting both people from the South seeking work and an educated class who made the area a center of culture, as well as a growing "Negro" middle class.
These people were looking for a fresh start in life, and Harlem was a good place to go. The district had originally been developed in the 19th century as an exclusive suburb for the white middle and upper middle classes; its affluent beginnings led to the development of stately houses, grand avenues, and world-class amenities such as the Polo Grounds and the Harlem Opera House.
During the enormous influx of European immigrants in the late 19th century, the once exclusive district was abandoned by the white middle class, who moved farther north.
Harlem became an African American neighborhood in the early 1900s. In 1910, a large block along 135th Street and Fifth Avenue was bought by various African American realtors and a church group.
Many more African Americans arrived during the First World War. Due to the war, the migration of laborers from Europe virtually ceased, while the war effort resulted in a massive demand for unskilled industrial labor. The Great Migration brought hundreds of thousands of African Americans to cities such as Chicago, Philadelphia, Detroit, Washington, D.C., and New York.
Despite the increasing popularity of Negro culture, virulent white racism, often by more recent ethnic immigrants, continued to affect African American communities, even in the North.
After the end of World War I, many African American soldiers—who fought in segregated units such as the Harlem Hellfighters—came home to a nation whose citizens often did not respect their accomplishments. Race riots and other civil uprisings occurred throughout the United States during the Red Summer of 1919, reflecting economic competition over jobs and housing in many cities, as well as tensions over social territories.
Mainstream recognition of Harlem culture:
The first stage of the Harlem Renaissance started in the late 1910s. In 1917, the premiere of Granny Maumee, The Rider of Dreams, and Simon the Cyrenian: Plays for a Negro Theater took place.
These plays, written by white playwright Ridgely Torrence, featured African American actors conveying complex human emotions and yearnings. They rejected the stereotypes of the blackface and minstrel show traditions. In 1917, James Weldon Johnson called the premieres of these plays "the most important single event in the entire history of the Negro in the American Theater".
Another landmark came in 1919, when the communist poet Claude McKay published his militant sonnet "If We Must Die", which introduced a dramatically political dimension to the themes of African cultural inheritance and modern urban experience featured in his 1917 poems "Invocation" and "Harlem Dancer".
Published under the pseudonym Eli Edwards, these marked his first appearance in print in the United States after immigrating from Jamaica. Although "If We Must Die" never alluded to race, African American readers heard its note of defiance in the face of racism and the nationwide race riots and lynchings then taking place.
By the end of the First World War, the fiction of James Weldon Johnson and the poetry of Claude McKay were describing the reality of contemporary African American life in America.
The Harlem Renaissance grew out of the changes that had taken place in the African American community since the abolition of slavery, as well as the expansion of African American communities in the North.
These accelerated as a consequence of World War I and the great social and cultural changes in the early 20th-century United States. Industrialization attracted people from rural areas to cities and gave rise to a new mass culture. Contributing factors leading to the Harlem Renaissance were the Great Migration of African Americans to Northern cities, which concentrated ambitious people in places where they could encourage each other, and the First World War, which had created new industrial work opportunities for tens of thousands of people.
Factors leading to the decline of this era include the Great Depression.
Literature:
In 1917, Hubert Harrison, "The Father of Harlem Radicalism", founded the Liberty League and The Voice, the first organization and the first newspaper, respectively, of the "New Negro Movement". Harrison's organization and newspaper were political but also emphasized the arts (his newspaper had "Poetry for the People" and book review sections).
In 1927, in the Pittsburgh Courier, Harrison challenged the notion of the Renaissance. He argued that the "Negro Literary Renaissance" notion overlooked "the stream of literary and artistic products which had flowed uninterruptedly from Negro writers from 1850 to the present," and said the so-called "Renaissance" was largely a white invention.
Alternatively, a writer such as the Chicago-based author Fenton Johnson, who began publishing in the early 1900s, has been called a "forerunner" of the Harlem Renaissance and "one of the first negro revolutionary poets."
Nevertheless, with the Harlem Renaissance came a sense of acceptance for African American writers; as Langston Hughes put it, with Harlem came the courage "to express our individual dark-skinned selves without fear or shame."
Alain Locke's anthology The New Negro was considered the cornerstone of this cultural revolution. The anthology featured several African American writers and poets, from the well-known, such as Zora Neale Hurston and communists Langston Hughes and Claude McKay, to the lesser known, like the poet Anne Spencer.
Many poets of the Harlem Renaissance were inspired to tie threads of African American culture into their poems; as a result, jazz poetry was heavily developed during this time. "The Weary Blues" was a notable jazz poem written by Langston Hughes.
Through their works of literature, black authors were able to give a voice to the African American identity, and strived for a community of support and acceptance.
Religion:
Christianity played a major role in the Harlem Renaissance. Many of the writers and social critics discussed the role of Christianity in African American lives. For example, Langston Hughes's famous poem "Madam and the Minister" reflects the temperature and mood towards religion in the Harlem Renaissance.
The cover story for The Crisis magazine's publication in May 1936 explains how important Christianity was regarding the proposed union of the three largest Methodist churches of 1936. This article shows the controversial question of unification for these churches.
The article "The Catholic Church and the Negro Priest", also published in The Crisis, January 1920, demonstrates the obstacles that African American priests faced in the Catholic Church.
The article confronts what it saw as policies based on race that excluded African Americans from higher positions in the Church.
Discourse:
Various forms of religious worship existed during this time of African American intellectual reawakening.
Although there were racist attitudes within the Abrahamic religious arenas of the day, many African Americans continued to push towards the practice of a more inclusive doctrine. For example, George Joseph MacWilliam recounted various experiences of rejection on the basis of his color and race during his pursuit of the priesthood, and he shared his frustrations in an attempt to incite action on the part of The Crisis magazine community.
There were other forms of spiritualism practiced among African Americans during the Harlem Renaissance. Some of these religions and philosophies were inherited from African ancestry. For example, the religion of Islam was present in Africa as early as the 8th century through the Trans-Saharan trade. Islam came to Harlem likely through the migration of members of the Moorish Science Temple of America, which was established in 1913 in New Jersey.
Various forms of Judaism were practiced, including Orthodox, Conservative and Reform Judaism, but it was the Black Hebrew Israelites who founded their religious belief system in the early 20th century, during the Harlem Renaissance.
Traditional forms of religion acquired from various parts of Africa were inherited and practiced during this era. Some common examples were Voodoo and Santeria.
Criticism:
Religious critique during this era was found in music, literature, art, theater and poetry. The Harlem Renaissance encouraged analytic dialogue that included the open critique and the adjustment of current religious ideas.
One of the major contributors to the discussion of African American renaissance culture was Aaron Douglas, whose artwork also reflected the revisions African Americans were making to Christian dogma. Douglas used biblical imagery as inspiration for various pieces of artwork, but with the rebellious twist of an African influence.
Countee Cullen's poem "Heritage" expresses the inner struggle of an African American between his past African heritage and the new Christian culture. A more severe criticism of the Christian religion can be found in Langston Hughes's poem "Merry Christmas", where he exposes the irony of religion as a symbol for good and yet a force for oppression and injustice.
Music:
A new way of playing the piano, called the Harlem Stride style, was created during the Harlem Renaissance, helping to blur the lines between poor African Americans and socially elite African Americans.
The traditional jazz band was composed primarily of brass instruments and was considered a symbol of the South, but the piano was considered an instrument of the wealthy. With this instrumental modification to the existing genre, the wealthy African Americans now had more access to jazz music. Its popularity soon spread throughout the country and was consequently at an all-time high.
Innovation and liveliness were important characteristics of performers in the beginnings of jazz.
Jazz performers and composers of the time were extremely talented, skillful, competitive and inspirational. They laid great parts of the foundations for future musicians of their genre.
Duke Ellington gained popularity during the Harlem Renaissance. According to Charles Garrett, "The resulting portrait of Ellington reveals him to be not only the gifted composer, bandleader, and musician we have come to know, but also an earthly person with basic desires, weaknesses, and eccentricities."
Ellington did not let his popularity get to him. He remained calm and focused on his music.
During this period, the musical style of blacks was becoming more and more attractive to whites. White novelists, dramatists and composers started to exploit the musical tendencies and themes of African Americans in their works.
Composers (including William Grant Still, William L. Dawson and Florence Price) used poems written by African American poets in their songs, and would implement the rhythms, harmonies and melodies of African American music—such as blues, spirituals and jazz—into their concert pieces.
African Americans began to merge with whites into the classical world of musical composition.
The first African American male to gain wide recognition as a concert artist in both his region and internationally was Roland Hayes. He trained with Arthur Calhoun in Chattanooga, and at Fisk University in Nashville. Later, he studied with Arthur Hubbard in Boston and with George Henschel and Amanda Ira Aldridge in London, England. Hayes began singing in public as a student, and he toured with the Fisk Jubilee Singers in 1911.
Musical theatre:
According to James Vernon Hatch and Leo Hamalian, the all-black revue Run, Little Chillun is considered one of the most successful musical dramas of the Harlem Renaissance.
Fashion:
During the Harlem Renaissance, the African American clothing scene took a dramatic turn from the prim and proper. Many young women preferred short skirts and silk stockings to drop-waisted dresses and cloche hats.
Women wore loose-fitted garments and accessorized with long strand pearl bead necklaces, feather boas, and cigarette holders. The fashion of the Harlem Renaissance was used to convey elegance and flamboyancy and needed to be created with the vibrant dance style of the 1920s in mind.
By the 1930s, a trendy egret-trimmed beret was popular.
Men wore loose suits that led to the later style known as the "Zoot", which consisted of wide-legged, high-waisted, peg-top trousers, and a long coat with padded shoulders and wide lapels.
Men also wore wide-brimmed hats, colored socks, white gloves and velvet-collared Chesterfield coats. During this period, African Americans expressed respect for their heritage through a fad for leopard-skin coats, indicating the power of the African animal.
While performing in Paris during the height of the Renaissance, the extraordinarily successful black dancer Josephine Baker was a major fashion trendsetter for black and white women alike. Her gowns from the couturier Jean Patou were copied, especially her stage costumes, which Vogue magazine called "startling".
Josephine Baker is also credited with highlighting the "art deco" fashion era after she performed the "Danse Sauvage". During this Paris performance, she donned a skirt made of string and artificial bananas. Ethel Moses was another popular black performer. Moses starred in silent films in the 1920s and 1930s and was recognizable by her signature bob hairstyle.
Photography:
James Van Der Zee's photography played an important role in shaping and documenting the cultural and social life of Harlem during the Harlem Renaissance. His photographs helped define the image and identity of the African American community, documented the achievements of cultural figures, and challenged stereotypes and racist attitudes, which in turn promoted pride and dignity among African Americans in Harlem and beyond.
Van Der Zee's studio was not just a place for taking photographs; it was also a social and cultural hub for Harlem residents. People would come to his studio not only to have their portraits taken, but also to socialize and to participate in the community events that he hosted. Van Der Zee's studio played an important role in the cultural life of Harlem during the early 20th century, and helped to foster a sense of community and pride among its residents.
Notable persons he photographed include Marcus Garvey, the leader of the Universal Negro Improvement Association (UNIA), a black nationalist organization that promoted Pan-Africanism and economic independence for African Americans, along with many other prominent black figures.
Van Der Zee's work gained renewed attention in the 1960s and 1970s, when interest in the Harlem Renaissance was revived. Van Der Zee's photographs have been featured in numerous exhibitions over the years.
One notable exhibition was "Harlem on My Mind: Cultural Capital of Black America, 1900-1968," which was organized by the Metropolitan Museum of Art in 1969. The exhibit included over 300 photographs, many of which were by Van Der Zee, and was one of the first major exhibitions to focus on the cultural achievements of African Americans in Harlem.
Van Der Zee's work was the eyes of Harlem. His photographs are recognized as important documents of African American life and culture during the early 20th century.
They serve as a visual record of the achievements of the Harlem Renaissance. His portraits of writers, musicians, artists and other cultural figures helped to promote their work and bring attention to the vibrant creative scene known as Harlem.
Characteristics and themes:
Characterizing the Harlem Renaissance was an overt racial pride that came to be represented in the idea of the New Negro, who through intellect and production of literature, art and music could challenge the pervading racism and stereotypes to promote progressive or socialist politics, and racial and social integration.
The creation of art and literature would serve to "uplift" the race.
There would be no uniting form singularly characterizing the art that emerged from the Harlem Renaissance. Rather, it encompassed a wide variety of cultural elements and styles, including a Pan-African perspective, "high-culture" and "low-culture" or "low-life", from the traditional form of music to the blues and jazz, traditional and new experimental forms in literature such as modernism and the new form of jazz poetry.
This duality meant that numerous African American artists came into conflict with conservatives in the black intelligentsia, who took issue with certain depictions of black life.
Some common themes represented during the Harlem Renaissance were the influence of the experience of slavery and emerging African American folk traditions on black identity, the effects of institutional racism, the dilemmas inherent in performing and writing for elite white audiences, and the question of how to convey the experience of modern black life in the urban North.
The Harlem Renaissance was one of primarily African American involvement. It rested on a support system of black patrons and black-owned businesses and publications. However, it also depended on the patronage of white Americans, such as Carl Van Vechten and Charlotte Osgood Mason, who provided various forms of assistance, opening doors which otherwise might have remained closed to the publication of work outside the black American community.
This support often took the form of patronage or publication. Carl Van Vechten was one of the most noteworthy white Americans involved with the Harlem Renaissance; he provided assistance to the black American community because he wanted racial equality.
There were other whites interested in so-called "primitive" cultures, as many whites viewed black American culture at that time, and wanted to see such "primitivism" in the work coming out of the Harlem Renaissance. As with most fads, some people may have been exploited in the rush for publicity.
Interest in African American lives also generated experimental but lasting collaborative work, such as the all-black productions of George Gershwin's opera Porgy and Bess, and Virgil Thomson and Gertrude Stein's Four Saints in Three Acts.
In both productions the choral conductor Eva Jessye was part of the creative team. Her choir was featured in Four Saints. The music world also found white band leaders defying racist attitudes to include the best and the brightest African-American stars of music and song in their productions.
African Americans used art to prove their humanity and demand equality. The Harlem Renaissance led to more opportunities for blacks to be published by mainstream houses. Many authors began to publish novels, magazines and newspapers during this time.
The new fiction attracted a great amount of attention from the nation at large, and a number of its authors became nationally known.
Richard Bruce Nugent (1906–1987), who wrote "Smoke, Lilies, and Jade", made an important contribution, especially in relation to experimental form and LGBT themes in the period.
The Harlem Renaissance helped lay the foundation for the post-World War II protest movement of the Civil Rights movement. Moreover, many black artists who rose to creative maturity afterward were inspired by this literary movement.
The Renaissance was more than a literary or artistic movement, as it possessed a certain sociological development—particularly through a new racial consciousness—through ethnic pride, as seen in the Back to Africa movement led by Jamaican Marcus Garvey.
At the same time, a different expression of ethnic pride, promoted by W. E. B. Du Bois, introduced the notion of the "talented tenth".
Du Bois wrote of the Talented Tenth: "The Negro race, like all races, is going to be saved by its exceptional men. The problem of education, then, among Negroes must first of all deal with the Talented Tenth; it is the problem of developing the best of this race that they may guide the mass away from the contamination and death of the worst."
These "talented tenth" were considered the finest examples of the worth of black Americans as a response to the rampant racism of the period. No particular leadership was assigned to the talented tenth, but they were to be emulated.
In both literature and popular discussion, complex ideas such as Du Bois's concept of "twoness" (dualism) were introduced (see The Souls of Black Folk; 1903). Du Bois explored a divided awareness of one's identity that was a unique critique of the social ramifications of racial consciousness. This exploration was later revived during the Black Pride movement of the early 1970s.
Influence:
A new black identity:
The Harlem Renaissance was successful in that it brought the black experience clearly within the corpus of American cultural history. Not only through an explosion of culture, but on a sociological level, the legacy of the Harlem Renaissance redefined how America, and the world, viewed African Americans.
The migration of Southern blacks to the North changed the image of the African American from rural, undereducated peasants to one of urban, cosmopolitan sophistication. This new identity led to a greater social consciousness, and African Americans became players on the world stage, expanding intellectual and social contacts internationally.
The progress—both symbolic and real—during this period became a point of reference from which the African American community gained a spirit of self-determination that provided a growing sense of both black urbanity and black militancy, as well as a foundation for the community to build upon for the Civil Rights struggles in the 1950s and 1960s.
The urban setting of rapidly developing Harlem provided a venue for African Americans of all backgrounds to appreciate the variety of black life and culture. Through this expression, the Harlem Renaissance encouraged the new appreciation of folk roots and culture.
For instance, folk materials and spirituals provided a rich source for the artistic and intellectual imagination, which freed blacks from the constraints of their past condition.
Through sharing in these cultural experiences, a consciousness sprang forth in the form of a united racial identity.
However, there was some pressure within certain groups of the Harlem Renaissance to adopt sentiments of conservative white America in order to be taken seriously by the mainstream.
As a result, queer culture, while far more accepted in Harlem than in most places in the country at the time, was most fully lived out in the dimly lit bars, nightclubs and cabarets of the city. It was within these venues that the blues music scene boomed, and, since it had not yet gained recognition within popular culture, queer artists used it as a way to express themselves honestly.
Even though there were factions within the Renaissance that were accepting of queer culture/lifestyles, one could still be arrested for engaging in homosexual acts. Many people, including author Alice Dunbar Nelson and "The Mother of Blues" Gertrude "Ma" Rainey, had husbands but were romantically linked to other women as well.
Women and the LGBTQ community:
During the Harlem Renaissance, various well-known figures, including Claude McKay, Langston Hughes, and Ethel Waters, are believed to have had private same-gender relationships, although this aspect of their lives remained undisclosed to the public during that era.
In the Harlem music scene, places such as the Cotton Club and Rockland Palace routinely held gay drag shows in addition to straight performances. Lesbian or bisexual women performers, such as blues singers Gladys Bentley and Bessie Smith, were a part of this cultural movement, which contributed to a renewed interest in African American culture among the black community and introduced it to a wider audience.
Although women's contributions to culture were often overlooked at the time, contemporary black feminist critics have endeavored to re-evaluate and recognize the cultural production of women during the Harlem Renaissance.
Authors such as Nella Larsen and Jessie Fauset have gained renewed critical acclaim for their work from modern perspectives.
Blues singer Gertrude "Ma" Rainey was known to dress in traditionally male clothing, and her blues lyrics often reflected her sexual proclivities for women, which was extremely radical at the time. Ma Rainey was also the first person to introduce blues music into vaudeville.
Rainey's protégé, Bessie Smith, was another artist who used the blues as a way to express unapologetic views on same-gender relations, with such lines as "When you see two women walking hand in hand, just look em' over and try to understand: They'll go to those parties – have the lights down low – only those parties where women can go."
Rainey, Smith, and artist Lucille Bogan were collectively known as "The Big Three of the Blues." Another prominent blues singer was Gladys Bentley, who was known to cross-dress. Bentley was the club owner of Clam House on 133rd Street in Harlem, which was a hub for queer patrons. The Hamilton Lodge in Harlem hosted an annual drag ball, drawing thousands of people to watch young men dance in drag.
Though there were safe spaces within Harlem, there were prominent voices, such as that of Abyssinian Baptist Church's minister Adam Clayton Powell Sr., who actively opposed homosexuality.
The Harlem Renaissance was instrumental in fostering the "New Negro" movement, an endeavor by African Americans to redefine their identity free from degrading stereotypes. The Neo-New Negro movement further challenged racial definitions, stereotypes, and gender norms and roles, seeking to address normative sexuality and sexism in American society.
These ideas received some pushback, particularly regarding sexual freedom for women, which was seen as confirming the stereotype that black women were sexually uninhibited.
Some members of the black bourgeoisie saw this as hindering the overall progress of the black community and fueling racist sentiments. Yet queer culture and artists defined major portions of the Harlem Renaissance; Henry Louis Gates Jr., the author of "The Black Man's Burden", wrote that the Harlem Renaissance "was surely as gay as it was black".
Criticism of the movement:
Many critics point out that the Harlem Renaissance could not escape its history and culture in its attempt to create a new one, or sufficiently separate from the foundational elements of white, European culture.
Often Harlem intellectuals, while proclaiming a new racial consciousness, resorted to mimicry of their white counterparts by adopting their clothing, sophisticated manners and etiquette. This "mimicry" may also be called assimilation, as that is typically what minority members of any social construct must do in order to fit social norms created by that construct's majority.
This could be seen as a reason that the artistic and cultural products of the Harlem Renaissance did not overcome the presence of white-American values and did not reject these values. In this regard, the creation of the "New Negro", as the Harlem intellectuals sought, was considered a failure.
The Harlem Renaissance appealed to a mixed audience. The literature appealed to the African American middle class and to whites. Magazines such as The Crisis, a monthly journal of the NAACP, and Opportunity, an official publication of the National Urban League, employed Harlem Renaissance writers on their editorial staffs, published poetry and short stories by black writers, and promoted African American literature through articles, reviews and annual literary prizes.
However, as important as these literary outlets were, the Renaissance relied heavily on white publishing houses and white-owned magazines.
A major accomplishment of the Renaissance was to open the door to mainstream white periodicals and publishing houses, although the relationship between the Renaissance writers and white publishers and audiences created some controversy.
W. E. B. Du Bois did not oppose the relationship between black writers and white publishers, but he was critical of works such as Claude McKay's bestselling novel Home to Harlem (1928) for appealing to the "prurient demand[s]" of white readers and publishers for portrayals of black "licentiousness".
Langston Hughes spoke for most of the writers and artists when he wrote in his essay "The Negro Artist and the Racial Mountain" (1926) that black artists intended to express themselves freely, no matter what the black public or white public thought.
Hughes in his writings also returned to the theme of racial passing, but during the Harlem Renaissance he began to explore the topic of homosexuality and homophobia. He began to use disruptive language in his writings, exploring this topic because it was a theme not discussed during the period.
African American musicians and writers were among mixed audiences as well, having experienced positive and negative outcomes throughout the New Negro Movement. For musicians, Harlem, New York's cabarets and nightclubs shined a light on black performers and allowed for black residents to enjoy music and dancing.
However, some of the most popular clubs (that showcased black musicians) were exclusively for white audiences; one of the most famous white-only nightclubs in Harlem was the Cotton Club, where popular black musicians like Duke Ellington frequently performed. Ultimately, the black musicians who appeared at these white-only clubs became far more successful and became a part of the mainstream music scene.
Similarly, black writers were given the opportunity to shine once the New Negro Movement gained traction as short stories, novels and poems by black authors began taking form and getting into various print publications in the 1910s and 1920s. Although a seemingly good way to establish their identities and culture, many authors note how hard it was for any of their work to actually go anywhere.
Writer Charles Chesnutt, for example, noted that when his work appeared in the Atlantic Monthly in 1887, there was no indication of his race alongside the publication (at the publisher's request).
A prominent factor in the New Negro's struggle was that their work had been made out to be "different" or "exotic" to white audiences, making a necessity for black writers to appeal to them and compete with each other to get their work out.
Famous black author and poet Langston Hughes explained that black-authored works were placed in a similar fashion to those of oriental or foreign origin, only being used occasionally in comparison to their white-made counterparts: Once a spot for a black work was "taken", black authors had to look elsewhere to publish.
Certain aspects of the Harlem Renaissance were accepted without debate, and without scrutiny. One of these was the future of the "New Negro". Artists and intellectuals of the Harlem Renaissance echoed American progressivism in its faith in democratic reform, in its belief in art and literature as agents of change, and in its almost uncritical belief in itself and its future.
This progressivist worldview rendered black intellectuals—just like their white counterparts—unprepared for the rude shock of the Great Depression, and the Harlem Renaissance ended abruptly because of naïve assumptions about the centrality of culture, unrelated to economic and social realities.
At the time, it was known as the "New Negro Movement", named after The New Negro, a 1925 anthology edited by Alain Locke.
The movement also included the new African American cultural expressions across the urban areas in the Northeast and Midwest United States affected by a renewed militancy in the general struggle for civil rights, combined with the Great Migration of African American workers fleeing the racist conditions of the Jim Crow Deep South, as Harlem was the final destination of the largest number of those who migrated north.
Though it was centered in the Harlem neighborhood, many francophone black writers from African and Caribbean colonies who lived in Paris were also influenced by the movement, which spanned from about 1918 until the mid-1930s. Many of its ideas lived on much longer.
The zenith of this "flowering of Negro literature", as James Weldon Johnson preferred to call the Harlem Renaissance, took place between 1924—when Opportunity: A Journal of Negro Life hosted a party for black writers where many white publishers were in attendance—and 1929, the year of the stock-market crash and the beginning of the Great Depression.
The Harlem Renaissance is considered to have been a rebirth of the African American arts.
Background
Until the end of the Civil War, the majority of African Americans had been enslaved and lived in the South. During the Reconstruction Era, the emancipated African Americans began to strive for civic participation, political equality, and economic and cultural self-determination.
Soon after the end of the Civil War, the Ku Klux Klan Act of 1871 gave rise to speeches by African American congressmen addressing this bill. By 1875, sixteen African Americans had been elected to Congress, where they gave numerous speeches exercising their newfound civil empowerment.
The Ku Klux Klan Act of 1871 was followed by the passage of the Civil Rights Act of 1875, part of Reconstruction legislation by Republicans. During the mid-to-late 1870s, racist whites organized in the Democratic Party launched a murderous campaign of racist terrorism to regain political power throughout the South.
From 1890 to 1908, they proceeded to pass legislation that disenfranchised most African Americans and many poor whites, trapping them without representation. They established white supremacist regimes of Jim Crow segregation in the South and one-party block voting behind Southern Democrats.
Democratic Party politicians (many having been former slaveowners and political and military leaders of the Confederacy) conspired to deny African Americans their exercise of civil and political rights by terrorizing black communities with lynch mobs and other forms of vigilante violence as well as by instituting a convict labor system that forced many thousands of African Americans back into unpaid labor in mines, plantations and on public works projects such as roads and levees.
Convict laborers were typically subject to brutal forms of corporal punishment, overwork and disease from unsanitary conditions. Death rates were extraordinarily high. While a small number of African Americans were able to acquire land shortly after the Civil War, most were exploited as sharecroppers.
Whether sharecropping or on their own acreage, most of the black population was closely financially dependent on agriculture. This added another impetus for the Migration: The arrival of the boll weevil. The beetle eventually came to waste 8% of the country's cotton yield annually and thus disproportionately impacted this part of America's citizenry.
As life in the South became increasingly difficult, African Americans began to migrate north in great numbers.
Most of the future leading lights of what was to become known as the "Harlem Renaissance" movement arose from a generation that had memories of the gains and losses of Reconstruction after the Civil War. Sometimes their parents, grandparents – or they themselves – had been slaves. Their ancestors had sometimes benefited by paternal investment in cultural capital, including better-than-average education.
Many in the Harlem Renaissance were part of the early 20th century Great Migration out of the South into the African American neighborhoods of the Northeast and Midwest. African Americans sought a better standard of living and relief from the institutionalized racism in the South.
Others were people of African descent from racially stratified communities in the Caribbean who came to the United States hoping for a better life. Uniting most of them was their convergence in Harlem.
Development:
During the early portion of the 20th century, Harlem was the destination for migrants from around the country, attracting both people from the South seeking work and an educated class who made the area a center of culture, as well as a growing "Negro" middle class.
For many of these migrants, Harlem offered a fresh start. The district had originally been developed in the 19th century as an exclusive suburb for the white middle and upper middle classes; its affluent beginnings led to the development of stately houses, grand avenues, and world-class amenities such as the Polo Grounds and the Harlem Opera House.
During the enormous influx of European immigrants in the late 19th century, the once exclusive district was abandoned by the white middle class, who moved farther north.
Harlem became an African American neighborhood in the early 1900s. In 1910, a large block along 135th Street and Fifth Avenue was bought by various African American realtors and a church group.
Many more African Americans arrived during the First World War. Due to the war, the migration of laborers from Europe virtually ceased, while the war effort resulted in a massive demand for unskilled industrial labor. The Great Migration brought hundreds of thousands of African Americans to cities such as Chicago, Philadelphia, Detroit, Washington, D.C., and New York.
Despite the increasing popularity of Negro culture, virulent white racism, often by more recent ethnic immigrants, continued to affect African American communities, even in the North.
After the end of World War I, many African American soldiers—who fought in segregated units such as the Harlem Hellfighters—came home to a nation whose citizens often did not respect their accomplishments. Race riots and other civil uprisings occurred throughout the United States during the Red Summer of 1919, reflecting economic competition over jobs and housing in many cities, as well as tensions over social territories.
Mainstream recognition of Harlem culture:
The first stage of the Harlem Renaissance started in the late 1910s. In 1917, the premiere of Granny Maumee, The Rider of Dreams, and Simon the Cyrenian: Plays for a Negro Theater took place.
These plays, written by white playwright Ridgely Torrence, featured African American actors conveying complex human emotions and yearnings. They rejected the stereotypes of the blackface and minstrel show traditions. In 1917, James Weldon Johnson called the premieres of these plays "the most important single event in the entire history of the Negro in the American Theater".
Another landmark came in 1919, when the communist poet Claude McKay published his militant sonnet "If We Must Die", which introduced a dramatically political dimension to the themes of African cultural inheritance and modern urban experience featured in his 1917 poems "Invocation" and "Harlem Dancer".
Published under the pseudonym Eli Edwards, these were his first appearance in print in the United States after immigrating from Jamaica. Although "If We Must Die" never alluded to race, African American readers heard its note of defiance in the face of racism and the nationwide race riots and lynchings then taking place.
By the end of the First World War, the fiction of James Weldon Johnson and the poetry of Claude McKay were describing the reality of contemporary African American life in America.
The Harlem Renaissance grew out of the changes that had taken place in the African American community since the abolition of slavery, as well as the expansion of African American communities in the North.
These accelerated as a consequence of World War I and the great social and cultural changes in the early 20th-century United States. Industrialization attracted people from rural areas to cities and gave rise to a new mass culture. Contributing factors leading to the Harlem Renaissance were the Great Migration of African Americans to Northern cities, which concentrated ambitious people in places where they could encourage each other, and the First World War, which had created new industrial work opportunities for tens of thousands of people.
Factors leading to the decline of this era include the Great Depression.
Literature:
In 1917, Hubert Harrison, "The Father of Harlem Radicalism", founded the Liberty League and The Voice, the first organization and the first newspaper, respectively, of the "New Negro Movement". Harrison's organization and newspaper were political but also emphasized the arts (his newspaper had "Poetry for the People" and book review sections).
In 1927, in the Pittsburgh Courier, Harrison challenged the notion of the Renaissance. He argued that the "Negro Literary Renaissance" notion overlooked "the stream of literary and artistic products which had flowed uninterruptedly from Negro writers from 1850 to the present," and said the so-called "Renaissance" was largely a white invention.
Alternatively, a writer like the Chicago-based author Fenton Johnson, who began publishing in the early 1900s, is called a "forerunner" of the Harlem Renaissance, "one of the first negro revolutionary poets".
Nevertheless, with the Harlem Renaissance came a sense of acceptance for African American writers; as Langston Hughes put it, with Harlem came the courage "to express our individual dark-skinned selves without fear or shame."
Alain Locke's anthology The New Negro was considered the cornerstone of this cultural revolution. The anthology featured several African American writers and poets, from the well-known, such as Zora Neale Hurston and communists Langston Hughes and Claude McKay, to the lesser known, like the poet Anne Spencer.
Many poets of the Harlem Renaissance were inspired to tie threads of African American culture into their poems; as a result, jazz poetry was heavily developed during this time. "The Weary Blues" was a notable jazz poem written by Langston Hughes.
Through their works of literature, black authors were able to give a voice to the African American identity, and strived for a community of support and acceptance.
Religion:
Christianity played a major role in the Harlem Renaissance. Many of the writers and social critics discussed the role of Christianity in African American lives. For example, the Langston Hughes poem "Madam and the Minister" reflects the mood toward religion in the Harlem Renaissance.
The cover story of the May 1936 issue of The Crisis magazine explains how important Christianity was to the proposed union of the three largest Methodist churches that year, and shows how controversial the question of unification was for these churches.
The article "The Catholic Church and the Negro Priest", also published in The Crisis, January 1920, demonstrates the obstacles that African American priests faced in the Catholic Church.
The article confronts what it saw as policies based on race that excluded African Americans from higher positions in the Church.
Discourse:
Various forms of religious worship existed during this time of African American intellectual reawakening.
Although there were racist attitudes within the prevailing Abrahamic religious arenas, many African Americans continued to push toward the practice of a more inclusive doctrine. For example, George Joseph MacWilliam recounted various experiences of rejection on the basis of his color and race during his pursuit of the priesthood, and he shared his frustration with the readers of The Crisis magazine in an attempt to incite action.
There were other forms of spiritualism practiced among African Americans during the Harlem Renaissance. Some of these religions and philosophies were inherited from African ancestry. For example, the religion of Islam was present in Africa as early as the 8th century through the Trans-Saharan trade. Islam came to Harlem likely through the migration of members of the Moorish Science Temple of America, which was established in 1913 in New Jersey.
Various forms of Judaism were practiced, including Orthodox, Conservative and Reform Judaism, but it was the Black Hebrew Israelites who founded their religious belief system during the Harlem Renaissance of the early 20th century.
Traditional forms of religion acquired from various parts of Africa were inherited and practiced during this era. Some common examples were Voodoo and Santeria.
Criticism:
Religious critique during this era was found in music, literature, art, theater and poetry. The Harlem Renaissance encouraged analytic dialogue that included the open critique and the adjustment of current religious ideas.
One of the major contributors to the discussion of African American renaissance culture was Aaron Douglas, who, with his artwork, also reflected the revisions African Americans were making to the Christian dogma. Douglas uses biblical imagery as inspiration to various pieces of artwork, but with the rebellious twist of an African influence.
Countee Cullen's poem "Heritage" expresses the inner struggle of an African American torn between his past African heritage and the new Christian culture. A more severe criticism of the Christian religion can be found in Langston Hughes's poem "Merry Christmas", where he exposes the irony of religion as a symbol of good and yet a force of oppression and injustice.
Music:
A new way of playing the piano, called the Harlem Stride style, was created during the Harlem Renaissance, helping to blur the lines between poor African Americans and socially elite African Americans.
The traditional jazz band was composed primarily of brass instruments and was considered a symbol of the South, but the piano was considered an instrument of the wealthy. With this instrumental modification to the existing genre, the wealthy African Americans now had more access to jazz music. Its popularity soon spread throughout the country and was consequently at an all-time high.
Innovation and liveliness were important characteristics of performers in the beginnings of jazz.
Jazz performers and composers at the time such as:
- Eubie Blake,
- Noble Sissle,
- Jelly Roll Morton,
- Luckey Roberts,
- James P. Johnson,
- Willie "The Lion" Smith,
- Andy Razaf,
- Fats Waller,
- Ethel Waters,
- Adelaide Hall,
- Florence Mills
- and various bandleaders.
These artists were extremely talented, skillful, competitive and inspirational, and they laid much of the foundation for future musicians in their genre.
Duke Ellington gained popularity during the Harlem Renaissance. According to Charles Garrett, "The resulting portrait of Ellington reveals him to be not only the gifted composer, bandleader, and musician we have come to know, but also an earthly person with basic desires, weaknesses, and eccentricities."
Ellington did not let his popularity go to his head; he remained calm and focused on his music.
During this period, the musical style of blacks was becoming more and more attractive to whites. White novelists, dramatists and composers started to exploit the musical tendencies and themes of African Americans in their works.
Composers (including William Grant Still, William L. Dawson and Florence Price) used poems written by African American poets in their songs, and would implement the rhythms, harmonies and melodies of African American music—such as blues, spirituals and jazz—into their concert pieces.
African Americans began to merge with whites into the classical world of musical composition.
The first African American male to gain wide recognition as a concert artist in both his region and internationally was Roland Hayes. He trained with Arthur Calhoun in Chattanooga, and at Fisk University in Nashville. Later, he studied with Arthur Hubbard in Boston and with George Henschel and Amanda Ira Aldridge in London, England. Hayes began singing in public as a student, and he toured with the Fisk Jubilee Singers in 1911.
Musical theatre:
According to James Vernon Hatch and Leo Hamalian, the all-black revue Run, Little Chillun is considered one of the most successful musical dramas of the Harlem Renaissance.
Fashion:
During the Harlem Renaissance, the African American clothing scene took a dramatic turn away from the prim and proper: many young women favored short skirts and silk stockings, drop-waisted dresses and cloche hats.
Women wore loose-fitted garments and accessorized with long strand pearl bead necklaces, feather boas, and cigarette holders. The fashion of the Harlem Renaissance was used to convey elegance and flamboyancy and needed to be created with the vibrant dance style of the 1920s in mind.
Popular by the 1930s was a trendy, egret-trimmed beret.
Men wore loose suits that led to the later style known as the "Zoot", which consisted of wide-legged, high-waisted, peg-top trousers, and a long coat with padded shoulders and wide lapels.
Men also wore wide-brimmed hats, colored socks, white gloves and velvet-collared Chesterfield coats. During this period, African Americans expressed respect for their heritage through a fad for leopard-skin coats, indicating the power of the African animal.
While performing in Paris during the height of the Renaissance, the extraordinarily successful black dancer Josephine Baker was a major fashion trendsetter for black and white women alike. Her gowns from the couturier Jean Patou were copied, especially her stage costumes, which Vogue magazine called "startling".
Josephine Baker is also credited with highlighting the "art deco" fashion era after she performed the "Danse Sauvage". During this Paris performance, she wore a skirt made of string and artificial bananas. Ethel Moses was another popular black performer; Moses starred in silent films in the 1920s and 1930s and was recognizable by her signature bob hairstyle.
Photography:
James Van Der Zee's photography played an important role in shaping and documenting the cultural and social life of Harlem during the Harlem Renaissance. His photographs helped shape the image and identity of the African American community, documenting the achievements of cultural figures and challenging stereotypes and racist attitudes, which in turn promoted pride and dignity among African Americans in Harlem and beyond.
Van Der Zee's studio was not just a place for taking photographs; it was also a social and cultural hub for Harlem residents. People would come to his studio not only to have their portraits taken, but also to socialize and to participate in the community events that he hosted. Van Der Zee's studio played an important role in the cultural life of Harlem during the early 20th century, and helped to foster a sense of community and pride among its residents.
Among the notable persons he photographed was Marcus Garvey, the leader of the Universal Negro Improvement Association (UNIA), a black nationalist organization that promoted Pan-Africanism and economic independence for African Americans.
Other notable black persons he photographed are:
- Countee Cullen, a poet and writer who was associated with the Harlem Renaissance;
- Josephine Baker, a dancer and entertainer who became famous in France and was known for her provocative performances;
- W.E.B. Du Bois, a sociologist, historian and civil rights activist who was a leading figure in the African American community in the early 20th century;
- Langston Hughes, a poet, novelist and playwright who was one of the most important writers of the Harlem Renaissance;
- and Madam C.J. Walker, an entrepreneur and philanthropist who was one of the first African American women to become a self-made millionaire, as well as her daughter A'Lelia Walker, a noted patron of Harlem Renaissance artists and writers.
Van Der Zee's work gained renewed attention in the 1960s and 1970s, when interest in the Harlem Renaissance was revived. Van Der Zee's photographs have been featured in numerous exhibitions over the years.
One notable exhibition was "Harlem on My Mind: Cultural Capital of Black America, 1900-1968," which was organized by the Metropolitan Museum of Art in 1969. The exhibit included over 300 photographs, many of which were by Van Der Zee, and was one of the first major exhibitions to focus on the cultural achievements of African Americans in Harlem.
Van Der Zee's work was the eyes of Harlem. His photographs are recognized as important documents of African American life and culture during the early 20th century.
They serve as a visual record of the achievements of the Harlem Renaissance. His portraits of writers, musicians, artists and other cultural figures helped to promote their work and bring attention to the vibrant creative scene known as Harlem.
Characteristics and themes:
Characterizing the Harlem Renaissance was an overt racial pride that came to be represented in the idea of the New Negro, who through intellect and production of literature, art and music could challenge the pervading racism and stereotypes to promote progressive or socialist politics, and racial and social integration.
The creation of art and literature would serve to "uplift" the race.
No single form united the art that emerged from the Harlem Renaissance. Rather, it encompassed a wide variety of cultural elements and styles, including a Pan-African perspective, "high-culture" and "low-culture" or "low-life", from traditional forms of music to the blues and jazz, and from traditional to new experimental forms in literature such as modernism and the new form of jazz poetry.
This duality meant that numerous African American artists came into conflict with conservatives in the black intelligentsia, who took issue with certain depictions of black life.
Some common themes represented during the Harlem Renaissance were the influence of the experience of slavery and emerging African American folk traditions on black identity, the effects of institutional racism, the dilemmas inherent in performing and writing for elite white audiences, and the question of how to convey the experience of modern black life in the urban North.
The Harlem Renaissance was one of primarily African American involvement. It rested on a support system of black patrons and black-owned businesses and publications. However, it also depended on the patronage of white Americans, such as Carl Van Vechten and Charlotte Osgood Mason, who provided various forms of assistance, opening doors which otherwise might have remained closed to the publication of work outside the black American community.
This support often took the form of patronage or publication. Carl Van Vechten was one of the most noteworthy white Americans involved with the Harlem Renaissance; he lent his assistance to the black American community because he believed in racial equality.
There were other whites interested in so-called "primitive" cultures, as many whites viewed black American culture at that time, and wanted to see such "primitivism" in the work coming out of the Harlem Renaissance. As with most fads, some people may have been exploited in the rush for publicity.
Interest in African American lives also generated experimental but lasting collaborative work, such as the all-black productions of George Gershwin's opera Porgy and Bess, and Virgil Thomson and Gertrude Stein's Four Saints in Three Acts.
In both productions the choral conductor Eva Jessye was part of the creative team. Her choir was featured in Four Saints. The music world also found white band leaders defying racist attitudes to include the best and the brightest African-American stars of music and song in their productions.
African Americans used art to prove their humanity and to demand equality. The Harlem Renaissance led to more opportunities for blacks to be published by mainstream houses, and many authors began to publish novels, magazines and newspapers during this time.
The new fiction attracted a great amount of attention from the nation at large. Among authors who became nationally known were:
- Jean Toomer,
- Jessie Fauset,
- Claude McKay,
- Zora Neale Hurston,
- James Weldon Johnson,
- Alain Locke,
- Omar Al Amiri,
- Eric D. Walrond
- and Langston Hughes.
Richard Bruce Nugent (1906–1987), who wrote "Smoke, Lilies, and Jade", made an important contribution, especially in relation to experimental form and LGBT themes in the period.
The Harlem Renaissance helped lay the foundation for the post-World War II protest movement of the Civil Rights movement. Moreover, many black artists who rose to creative maturity afterward were inspired by this literary movement.
The Renaissance was more than a literary or artistic movement, as it possessed a certain sociological development—particularly through a new racial consciousness—through ethnic pride, as seen in the Back to Africa movement led by Jamaican Marcus Garvey.
At the same time, a different expression of ethnic pride, promoted by W. E. B. Du Bois, introduced the notion of the "talented tenth".
Du Bois wrote of the Talented Tenth: "The Negro race, like all races, is going to be saved by its exceptional men. The problem of education, then, among Negroes must first of all deal with the Talented Tenth; it is the problem of developing the best of this race that they may guide the mass away from the contamination and death of the worst."
These "talented tenth" were considered the finest examples of the worth of black Americans as a response to the rampant racism of the period. No particular leadership was assigned to the talented tenth, but they were to be emulated.
In both literature and popular discussion, complex ideas such as Du Bois's concept of "twoness" (dualism) were introduced (see The Souls of Black Folk; 1903). Du Bois explored a divided awareness of one's identity that was a unique critique of the social ramifications of racial consciousness. This exploration was later revived during the Black Pride movement of the early 1970s.
Influence:
A new black identity:
The Harlem Renaissance was successful in that it brought the black experience clearly within the corpus of American cultural history. Not only through an explosion of culture, but on a sociological level, the legacy of the Harlem Renaissance redefined how America, and the world, viewed African Americans.
The migration of Southern blacks to the North changed the image of the African American from that of rural, undereducated peasants to one of urban, cosmopolitan sophistication. This new identity led to a greater social consciousness, and African Americans became players on the world stage, expanding intellectual and social contacts internationally.
The progress—both symbolic and real—during this period became a point of reference from which the African American community gained a spirit of self-determination that provided a growing sense of both black urbanity and black militancy, as well as a foundation for the community to build upon for the Civil Rights struggles in the 1950s and 1960s.
The urban setting of rapidly developing Harlem provided a venue for African Americans of all backgrounds to appreciate the variety of black life and culture. Through this expression, the Harlem Renaissance encouraged the new appreciation of folk roots and culture.
For instance, folk materials and spirituals provided a rich source for the artistic and intellectual imagination, freeing blacks from the weight of their past condition.
Through sharing in these cultural experiences, a consciousness sprang forth in the form of a united racial identity.
However, there was some pressure within certain groups of the Harlem Renaissance to adopt sentiments of conservative white America in order to be taken seriously by the mainstream.
As a result, queer culture, while far more accepted in Harlem than in most places in the country at the time, was most fully lived out in the smoky, dimly lit bars, nightclubs and cabarets of the city. It was within these venues that the blues music scene boomed, and, since it had not yet gained recognition within popular culture, queer artists used it as a way to express themselves honestly.
Even though there were factions within the Renaissance that were accepting of queer culture/lifestyles, one could still be arrested for engaging in homosexual acts. Many people, including author Alice Dunbar Nelson and "The Mother of Blues" Gertrude "Ma" Rainey, had husbands but were romantically linked to other women as well.
Women and the LGBTQ community:
During the Harlem Renaissance, various well-known figures, including Claude McKay, Langston Hughes, and Ethel Waters, are believed to have had private same-gender relationships, although this aspect of their lives remained undisclosed to the public during that era.
In the Harlem music scene, places such as the Cotton Club and Rockland Palace routinely held gay drag shows in addition to straight performances. Lesbian or bisexual women performers, such as blues singers Gladys Bentley and Bessie Smith, were a part of this cultural movement, which contributed to a renewed interest in African American culture among the black community and introduced it to a wider audience.
Although women's contributions to culture were often overlooked at the time, contemporary black feminist critics have endeavored to re-evaluate and recognize the cultural production of women during the Harlem Renaissance.
Authors such as Nella Larsen and Jessie Fauset have gained renewed critical acclaim for their work from modern perspectives.
Blues singer Gertrude "Ma" Rainey was known to dress in traditionally male clothing, and her blues lyrics often reflected her sexual interest in women, which was extremely radical at the time. Ma Rainey was also the first person to introduce blues music into vaudeville.
Rainey's protégé, Bessie Smith, was another artist who used the blues as a way to express unapologetic views on same-gender relations, with such lines as "When you see two women walking hand in hand, just look 'em over and try to understand: They'll go to those parties – have the lights down low – only those parties where women can go."
Rainey, Smith, and artist Lucille Bogan were collectively known as "The Big Three of the Blues." Another prominent blues singer was Gladys Bentley, who was known to cross-dress. Bentley was the club owner of Clam House on 133rd Street in Harlem, which was a hub for queer patrons. The Hamilton Lodge in Harlem hosted an annual drag ball, drawing thousands of people to watch young men dance in drag.
Though there were safe spaces within Harlem, there were prominent voices, such as that of Abyssinian Baptist Church's minister Adam Clayton Powell Sr., who actively opposed homosexuality.
The Harlem Renaissance was instrumental in fostering the "New Negro" movement, an endeavor by African Americans to redefine their identity free from degrading stereotypes. The Neo-New Negro movement further challenged racial definitions, stereotypes, and gender norms and roles, seeking to address normative sexuality and sexism in American society.
These ideas received some pushback, particularly regarding sexual freedom for women, which was seen as confirming the stereotype that black women were sexually uninhibited.
Some members of the black bourgeoisie saw this as hindering the overall progress of the black community and fueling racist sentiments. Yet queer culture and artists defined major portions of the Harlem Renaissance; Henry Louis Gates Jr., the author of "The Black Man's Burden", wrote that the Harlem Renaissance "was surely as gay as it was black".
Criticism of the movement:
Many critics point out that the Harlem Renaissance could not escape its history and culture in its attempt to create a new one, nor sufficiently separate itself from the foundational elements of white, European culture.
Often Harlem intellectuals, while proclaiming a new racial consciousness, resorted to mimicry of their white counterparts by adopting their clothing, sophisticated manners and etiquette. This "mimicry" may also be called assimilation, as that is typically what minority members of any social construct must do in order to fit social norms created by that construct's majority.
This could be seen as a reason that the artistic and cultural products of the Harlem Renaissance did not overcome the presence of white-American values and did not reject these values. In this regard, the creation of the "New Negro", as the Harlem intellectuals sought, was considered a failure.
The Harlem Renaissance appealed to a mixed audience. The literature appealed to the African American middle class and to whites. Magazines such as The Crisis, a monthly journal of the NAACP, and Opportunity, an official publication of the National Urban League, employed Harlem Renaissance writers on their editorial staffs, published poetry and short stories by black writers, and promoted African American literature through articles, reviews and annual literary prizes.
However, as important as these literary outlets were, the Renaissance relied heavily on white publishing houses and white-owned magazines.
A major accomplishment of the Renaissance was to open the door to mainstream white periodicals and publishing houses, although the relationship between the Renaissance writers and white publishers and audiences created some controversy.
W. E. B. Du Bois did not oppose the relationship between black writers and white publishers, but he was critical of works such as Claude McKay's bestselling novel Home to Harlem (1928) for appealing to the "prurient demand[s]" of white readers and publishers for portrayals of black "licentiousness".
Langston Hughes spoke for most of the writers and artists when he wrote in his essay "The Negro Artist and the Racial Mountain" (1926) that black artists intended to express themselves freely, no matter what the black public or white public thought.
In his writings Hughes also returned to the theme of racial passing, but during the Harlem Renaissance he began to explore the topics of homosexuality and homophobia, adopting a more disruptive language. He explored these topics because they were rarely discussed during the period.
African American musicians and writers also faced mixed audiences, experiencing both positive and negative outcomes throughout the New Negro Movement. For musicians, Harlem's cabarets and nightclubs shined a light on black performers and allowed black residents to enjoy music and dancing.
However, some of the most popular clubs (that showcased black musicians) were exclusively for white audiences; one of the most famous white-only nightclubs in Harlem was the Cotton Club, where popular black musicians like Duke Ellington frequently performed. Ultimately, the black musicians who appeared at these white-only clubs became far more successful and became a part of the mainstream music scene.
Similarly, black writers were given the opportunity to shine once the New Negro Movement gained traction, as short stories, novels and poems by black authors began taking form and appearing in various print publications in the 1910s and 1920s. Although this seemed a good way to establish their identities and culture, many authors noted how hard it was for their work to actually go anywhere.
Writer Charles Chesnutt, for example, noted that there was no indication of his race alongside his 1887 publication in the Atlantic Monthly (at the publisher's request).
A prominent factor in the New Negro's struggle was that their work was made out to be "different" or "exotic" to white audiences, making it a necessity for black writers to appeal to those audiences and to compete with each other to get their work out.
Famous black author and poet Langston Hughes explained that black-authored works were treated much like those of oriental or foreign origin, used only occasionally alongside their white-made counterparts: once a spot for a black work was "taken", black authors had to look elsewhere to publish.
Certain aspects of the Harlem Renaissance were accepted without debate or scrutiny. One of these was the future of the "New Negro". Artists and intellectuals of the Harlem Renaissance echoed American progressivism in its faith in democratic reform, in its belief in art and literature as agents of change, and in its almost uncritical belief in itself and its future.
This progressivist worldview rendered black intellectuals—just like their white counterparts—unprepared for the rude shock of the Great Depression, and the Harlem Renaissance ended abruptly because of naïve assumptions about the centrality of culture, unrelated to economic and social realities.
Works associated with the Harlem Renaissance:
- Blackbirds of 1928
- Encyclopedia of the Harlem Renaissance (book)
- The New Negro: The Life of Alain Locke
- Shuffle Along, musical
- Untitled (The Birth), painting
- Voodoo (opera)
- When Washington Was in Vogue
- The Negro in Art
- Taboo (1922 play)
- There'll Be Some Changes Made
See Also:
- Black Arts Movement, 1960s and 1970s
- Black Renaissance in D.C.
- Chicago Black Renaissance
- List of female entertainers of the Harlem Renaissance
- List of figures from the Harlem Renaissance
- New Negro
- Niggerati
- William E. Harmon Foundation award
- Cotton Club, nightclub
- Roaring Twenties
- African-American art
- African-American culture
- African-American literature
- List of African-American visual artists
- "A Guide to Harlem Renaissance Materials", from the Library of Congress
- Bryan Carter (ed.). "Virtual Harlem". University of Illinois at Chicago, Electronic Visualization Laboratory.
- Underneath A Harlem Moon by Iain Cameron Williams
- I'd Like to Show You Harlem – by Rollin Lynde Hartt, The Independent, April, 1921
- Collection: "Artists of the Harlem Renaissance" from the University of Michigan Museum of Art
- The Harlem Renaissance and Transatlantic Modernism Exhibit at the Metropolitan Museum of Art, February 25 – July 28, 2024.
- Articles in The New York Times on the Harlem Renaissance, including on the 2024 exhibit at the Metropolitan Museum of Art
Diversity, Equity and Inclusion
- YouTube Video: What is DEI (Diversity, Equity, & Inclusion) and Why Does It Matter In The Workplace?
- YouTube Video: TEDxHampshireCollege - Jay Smooth - How I Learned to Stop Worrying and Love Discussing Race
- YouTube Video: Answering the Attack: Why Diversity, Equity, and Inclusion (DEI) Helps Everyone
Diversity, equity, and inclusion (DEI) are organizational frameworks which seek to promote the fair treatment and full participation of all people, particularly groups who have historically been underrepresented or subject to discrimination on the basis of identity or disability.
These three notions (diversity, equity, and inclusion) together represent "three closely linked values" which organizations seek to institutionalize through DEI frameworks. Some experts say diversity and inclusion should be decoupled in some cases. Some frameworks, primarily in Britain, substitute the notion of "equity" with equality: equality, diversity, inclusion (EDI).
Other variations of the acronym also exist.
Diversity refers to the presence of variety within the organizational workforce, such as in identity and identity politics.
Equity refers to concepts of fairness and justice, such as fair compensation and substantive equality. More specifically, equity usually also includes a focus on societal disparities and allocating resources and "decision making authority to groups that have historically been disadvantaged", and taking "into consideration a person's unique circumstances, adjusting treatment accordingly so that the end result is equal."
Finally, inclusion refers to creating an organizational culture that creates an experience where "all employees feel their voices will be heard", and a sense of belonging and integration.
DEI is most often used to describe certain "training" efforts, such as diversity training. Though DEI is best known as a form of corporate training, it also finds implementation within many types of organizations, such as within academia, schools, and hospitals.
In recent years, DEI efforts and policies have generated criticism, some directed at the specific effectiveness of its tools, such as diversity training, its effect on free speech and academic freedom, as well as more broadly attracting criticism on political or philosophical grounds.
History:
DEI policy emerged from affirmative action in the United States. The legal term "affirmative action" was first used in Executive Order 10925, signed by President John F. Kennedy on 6 March 1961, which included a provision that government contractors "take affirmative action to ensure that applicants are employed, and employees are treated [fairly] during employment, without regard to their race, creed, color, or national origin". The phrase was used to promote actions that achieve non-discrimination.
In September 1965, President Lyndon Johnson issued Executive Order 11246 which required government employers to "hire without regard to race, religion and national origin" and "take affirmative action to ensure that applicants are employed and that employees are treated during employment, without regard to their race, color, religion, sex or national origin."
The Civil Rights Act of 1964 prohibited discrimination on the basis of race, color, religion, sex or national origin. Neither executive order nor the Civil Rights Act authorized group preferences. The Senate floor manager of the bill, Senator Hubert Humphrey, declared that the bill "would prohibit preferential treatment for any particular group", adding, "I will eat my hat if this leads to racial quotas."
However, affirmative action in practice eventually became synonymous with preferences, goals and quotas as upheld or struck down by Supreme Court decisions, even though no law had been passed explicitly permitting discrimination in favor of disadvantaged groups. Some state laws explicitly banned racial preferences, and attempts to pass laws explicitly legalizing them have failed.
Affirmative action is intended to alleviate under-representation and to promote the opportunities of defined minority groups within a society to give them equal access to that of the majority population.
The philosophical basis of the policy has various rationales, including but not limited to compensation for past discrimination, correction of current discrimination, and the diversification of society. It is often implemented in governmental and educational settings to ensure that designated groups within a society can participate in all promotional, educational, and training opportunities.
The stated justification for affirmative action by its proponents is to help compensate for past discrimination, persecution or exploitation by the ruling class of a culture, and to address existing discrimination. More recently concepts have moved beyond discrimination to include diversity, equity and inclusion as motives for preferring historically underrepresented groups.
In the famous 1978 Bakke decision, Regents of the University of California v. Bakke, diversity became a factor in constitutional law. The Supreme Court ruled that racial quotas were illegal but that universities could consider race as a plus factor when trying to foster "diversity" in their classes.
Diversity themes gained momentum in the mid-1980s. At a time when President Ronald Reagan threatened to dismantle equality and affirmative action laws in the 1980s, equality and affirmative action professionals employed by US firms along with equality consultants, engaged in establishing the argument that a diverse workforce should be seen as a competitive advantage rather than just as a legal constraint.
Basically, their message was: do not promote diversity because it is a legal mandate, but because it is good for business. From then on, researchers started to test a number of hypotheses on the business benefits of diversity and of diversity management, known as the "business case" for diversity.
By 2003, corporations were spending $8 billion annually on diversity. After the election of Donald Trump in 2016 and the ascent of the #MeToo and Black Lives Matter movements, Time magazine stated in 2019 that the DEI industry had "exploded" in size.
Within academia, a 2019 survey found that spending on DEI efforts had increased 27 percent over the five preceding academic years.
One 2020 estimate placed the size of the global diversity and inclusion market at $7.5 billion, of which $3.4 billion was in the United States, projecting it to reach $17.2 billion by 2027.
In 2021, New York magazine stated "the business became astronomically larger than ever" after the murder of George Floyd in May 2020. The Economist has also stated that surveys of international companies indicate that the number of people hired for jobs with "diversity" or "inclusion" in the title more than quadrupled since 2010.
As of 2024, affirmative action rhetoric has been increasingly replaced by emphasis on diversity, equity, and inclusion, while nine states explicitly ban its use in the employment process.
The Supreme Court in 2023 explicitly rejected affirmative action regarding race in college admissions in Students for Fair Admissions v. Harvard. The Court held that affirmative action programs "lack sufficiently focused and measurable objectives warranting the use of race, unavoidably employ race in a negative manner, involve racial stereotyping, and lack meaningful end points. We have never permitted admissions programs to work in that way, and we will not do so today".
Methods and arguments:
Further information:
In a 2018 article, proponents of DEI argued that because businesses and corporations exist within a larger world, they cannot be completely separated from the issues that exist in society. Therefore, the authors argue the need for DEI to improve coworker relations and teamwork.
Through a DEI plan, organizations outline measures to be taken, including recruiting and retaining personnel, fostering effective communication channels, imparting relevant training, and regulating workplace conduct.
As of 2022, many academic institutions in the US have also made commitments to DEI in different ways, including creating documents and programs and appointing dedicated staff members.
Many accreditation agencies now require institutions to support DEI. As of 2014, information on DEI for both students and professors was widespread in colleges and universities, with many schools requiring training and meetings on the topic. Many scholarships and opportunities at universities even have a secondary purpose of encouraging diversity.
Diversity in higher education can be difficult, with diverse students often feeling reduced to fulfilling a 'diversity quota,' which can carry a high emotional tax.
Within healthcare, DEI reflective groups have been used to enhance cultural sensitivity among mental health professionals. Such reflective spaces help improve mental health professionals' reflexivity and awareness of DEI-related issues, both within direct clinical work with clients, their families, and wider systems, and within professional supervision and teams.
DEI positions also exist with the goal of creating allies for public school students through resources and staff training, in order to support students facing social disparities. Other proponents of allyship consider impromptu speaking a key skill that enables allies to respond authentically in everyday words and reactions.
Corporate:
Main article: Diversity (business)
Following the murder of George Floyd in 2020, some companies made substantial commitments to racial equity by establishing dedicated diversity, equity, and inclusion teams. In early 2024 the Washington Post reported that there is a trend in corporate America to reduce DEI positions and delegate the work to external consultants.
The number of DEI jobs reached its highest point in early 2023, but subsequently decreased by 5 percent that year and by a further 8 percent in 2024. The attrition rate for DEI roles has been approximately twice as high as that of non-DEI positions.
The scaling back of DEI initiatives has aligned with a rise in legal challenges and political opposition to systematic endeavors aimed at enhancing racial equity.
Diversity management can be seen to "leverage organisational diversity to enhance organisational justice and achieve better business outcomes".
Several reports, including some by McKinsey & Company, claim financial benefits of DEI, but have been criticized for over-generalization, lack of rigour, and failure to distinguish between correlation and causation.
At an aggregate level, Alesina, Harnoss, and Rapoport (2013) have shown that the birth-country diversity of the labor force positively affects a nation's long-term productivity and income.
Firm-level research has provided conditional support to the proposal that workforce diversity per se brings business benefits with it. In short, whether diversity pays off or not depends on environmental factors, internal or external to the firm. Dwyer, Richard & Chadwyck (2003) found that the effects of gender diversity at the management level are conditional on the firm's strategic orientation, the organizational culture and the multivariate interaction among these variables.
Schäffner, Gebert, Schöler, & Kirch (2006) found that if the firm's culture incorporates the normative assumption or belief that diversity is an opportunity, then age diversity becomes a predictor of team innovativeness, but not otherwise.
Kearney & Gebert (2006) found that diversity in age, nationality, and functional background, have a positive effect on team innovativeness in a high transformational leadership context, but no effect in a low one.
A curvilinear relationship between diversity and performance was identified by Richard, Barnett, Dwyer, & Chadwick (2004).
Kochan, Bezrukova, Ely, Jackson, Joshi, Jehn et al. (2003), found few positive or negative direct effects of diversity on performance. In the cases that came under their scrutiny, a number of different aspects of the organizational context or group processes moderated the diversity-performance relationship.
Failing to manage diversity properly or developing diversity per se leads to only mixed results (Bell & Berry, 2007; Klein & Harrison, 2007), although Risberg & Corvellec show that approaching diversity management in terms of trying is a way to emphasize the performative dimension of diversity management beyond a reductionist dichotomy between success and failure.
Overall research suggests that diversity needs to be properly managed if any business benefits are to be reaped. If properly managed under the right conditions, diversity likely will hold its business promises. Given this conditional nature, the topic remains open to debate and further research.
Research indicates that attempts to promote diversity can provoke defensive responses: One study showed that even incidental allusions to diversity during interviews promoted defensive reactions in White male applicants. Indeed, after diversity was mentioned, their performance during the interview deteriorated and their physiological arousal increased.
Criticism and controversy:
According to The Chronicle of Higher Education, institutions are making defensive adjustments to the criticism. Some schools are removing the word “diversity” from titles of offices and jobs; some are closing campus spaces set up for students according to identity; some are ending diversity training; and some have stopped asking all faculty and staff members for written affirmations of their commitment to diversity.
Diversity training:
Further information: Diversity training
Diversity training, a common tool used in DEI efforts, has repeatedly come under criticism as being ineffective or even counterproductive. The Economist has stated that "the consensus now emerging among academics is that many anti-discrimination policies have no effect. What is worse, they often backfire".
A regular claim is that these efforts mainly serve to protect against litigation. Critics have also pointed to limited progress in achieving racial diversity in corporate leadership, particularly for Black professionals, attributing this to a lack of diverse chief diversity officers and to a broad DEI focus that overlooks the specific issues Black professionals face.
A 2007 study of 829 companies over 31 years showed "no positive effects in the average workplace" from diversity training, while the effect was negative where it was mandatory.
According to Harvard University professor in sociology and diversity researcher Frank Dobbin, "[O]n average, the typical all-hands-on-deck, 'everybody has to have diversity training'—that typical format in big companies doesn't have any positive effects on any historically underrepresented groups like black men or women, Hispanic men or women, Asian-American men or women or white women."
Mandatory diversity statements within academia:
The use of mandatory "diversity statements" within academia, wherein an applicant or faculty member outlines their "past contributions" and plans "for advancing diversity, equity and inclusion" if hired, has become controversial and sparked criticism.
The Foundation for Individual Rights and Expression (FIRE) has called such practices an attack on academic freedom, stating that "[v]ague or ideologically motivated DEI statement policies can too easily function as litmus tests for adherence to prevailing ideological views on DEI" and "penalize faculty for holding dissenting opinions on matters of public concern".
According to a 2022 survey conducted by the American Association of University Professors, one in five American colleges and universities include DEI criteria in tenure standards, including 45.6% of institutions with more than 5000 students.
Some universities have begun to weigh diversity statements heavily in hiring processes. For example, University of California, Berkeley eliminated three-quarters of applicants for five faculty positions in the life sciences exclusively on the basis of their diversity statements in the hiring cycle of 2018–2019.
The Academic Freedom Alliance (AFA) has called for the end of required diversity statements, stating it "encourages cynicism and dishonesty" and erases "the distinction between academic expertise and ideological conformity".
Social psychologist Jonathan Haidt, who resigned from the Society for Personality and Social Psychology in protest against mandatory diversity statements, has stated that "most academic work has nothing to do with diversity, so these mandatory statements force many academics to betray their quasi-fiduciary duty to the truth by spinning, twisting, or otherwise inventing some tenuous connection to diversity".
Other criticisms include that it "devalues merit"; is connected to affirmative action; that it violates the First Amendment; or functions as a loyalty oath.
A 1500-person survey conducted by FIRE reported that the issue is highly polarizing for faculty members, with half saying their view more closely aligns with the description of diversity statements as "a justifiable requirement for a job at a university", while the other half saw it as "an ideological litmus test that violates academic freedom".
According to Professor Randall L. Kennedy at Harvard University, "many academics at Harvard and beyond feel intense and growing resentment against the DEI enterprise because of features that are perhaps most evident in the demand for DEI statements", stating "I am a scholar on the left committed to struggles for social justice. The realities surrounding mandatory DEI statements, however, make me wince".
Several U.S. states have implemented legislation to ban mandatory diversity statements. In 2024, MIT announced that diversity statements "will no longer be part of applications for any faculty positions" at the university, becoming the first major university to abandon the practice.
Equity versus equality:
According to DEI frameworks, "equity is different than equality in that equality implies treating everyone as if their experiences are exactly the same." A common identification, especially among critics, is of equality as meaning "equality of opportunities" and equity as "equality of outcome".
This difference between equity and equality is also called Dilemma of Difference. Some have criticized the focus on equity rather than equality, arguing that the former runs contrary to a focus on merit or non-discrimination.
Political scientist Charles Lipson has called "equity" a "mandate to discriminate", threatening the principle of "equality under the law", while Canadian psychologist Jordan Peterson, a frequent critic of DEI, has called equity "the most egregious, self-righteous, historically-ignorant and dangerous" of the three titular notions of DEI.
The debate has also branched into the realm of politics. Commenting on Governor of Texas Greg Abbott calling DEI initiatives "illegal", a spokesperson for his office stated "The issue is not diversity — the issue is that equity is not equality. Here in Texas, we give people a chance to advance based on talent and merit".
Effects of DEI policies on free speech and academic freedom:
In recent years, high-profile incidents of campus conflict have sparked debate about the effect of DEI on the campus environment, academic freedom, and free speech.
The 2021 cancelling of a Massachusetts Institute of Technology (MIT) guest lecture by astrophysicist Dorian Abbot after he criticized DEI programs led to media attention and controversy. As a result, MIT empaneled a committee to investigate the state of academic freedom at the university.
The 2023 disruption of a talk by Fifth Circuit Court of Appeals Judge Kyle Duncan at Stanford Law School sparked criticism and discussion in the media, with many focusing on the role of Associate DEI Dean Tirien Steinbach, who joined protesters in denouncing Duncan's presence on campus.
In the wake of the incident, the editorial board of the Wall Street Journal opined that DEI offices have "become weapons to intimidate and limit speech". Steinbach replied with a piece entitled "Diversity and Free Speech Can Coexist at Stanford" that was published in the Journal the following week.
Dean of Stanford Law School Jenny S. Martínez also published a ten-page document addressing the situation and clarifying Stanford's position on free speech. In it, Martínez stated that the university's commitment to DEI "can and should be implemented in ways that are consistent with its commitment to academic freedom and free speech" and that she believed that "the commitment to diversity, equity, and inclusion actually means that we must protect free expression of all views."
She added that the commitment would not take the form of "having the school administration announce institutional positions on a wide range of current social and political issues, make frequent institutional statements about current news events, or exclude or condemn speakers who hold views on social and political issues with whom some or even many in our community disagree."
She criticized this definition of an "inclusive environment" by stating it "can lead to creating and enforcing an institutional orthodoxy."
In April 2023, a group of 29 scientists, including Nobel laureates Dan Shechtman and Arieh Warshel, published a paper that outlined what the authors see as a "clash in science between classical liberal values" and a "new postmodern worldview", which, they argue, is "enforced by 'Diversity, Equity, and Inclusion' (DEI) officers and bureaucracies" and "threatens the entire scientific enterprise."
Two of the authors, Anna Krylov and Jerry Coyne, subsequently argued in an op-ed in the Wall Street Journal that the paper's emphasis on merit, "once anodyne and unobjectionable [...] now contentious and outré, even in the hard sciences", led to its rejection by major journals and its subsequent publication in the Journal of Controversial Ideas.
The 2023 suicide of former Toronto principal Richard Bilkszto led to a new wave of controversy surrounding DEI in the workplace and its impact on freedom of expression.
Bilkszto had earlier filed a lawsuit against the Toronto District School Board in the wake of a 2021 incident at a DEI training seminar; Bilkszto was later diagnosed with "anxiety secondary to a workplace event", and claimed the session and its aftermath had destroyed his reputation.
Bilkszto's lawyer has publicly linked this incident and its aftermath with his death.
In the wake of Bilkszto's death, Ontario Minister of Education Stephen Lecce stated he had asked for a review and "options to reform professional training and strengthen accountability on school boards so this never happens again", calling Bilkszto's allegations before his death "serious and disturbing".
Bilkszto's death generated international attention and renewed debate on DEI and freedom of speech. According to The Globe and Mail, the incident has also been "seized on by a number of prominent right-wing commentators looking to roll-back [DEI] initiatives."
The anti-racism trainer involved in the incident has stated they welcome the review by Lecce, and stated that the incident has been "weaponized to discredit and suppress the work of people committed [to DEI]".
Antisemitism:
DEI has been accused of ignoring or even contributing to antisemitism. According to Andria Spindel of the Canadian Antisemitism Education Foundation, antisemitism has been largely ignored in the DEI curriculum.
The relationship between DEI and campus antisemitism came under further scrutiny after the October 7, 2023, Hamas attack on Israel and subsequent war in Gaza.
Tabia Lee, a former DEI director at De Anza College in California and a DEI critic, has claimed that DEI frameworks foster antisemitism through their "oppressors and the oppressed" dichotomy, whereby "Jews are categorically placed in the oppressor category" and described as "white oppressors".
She has claimed that her attempts to include Jews under the DEI umbrella were resisted. When her critics asked the college trustees to oust her from her role, one counselor explicitly referenced her attempts to place Jewish students "on the same footing as marginalized groups".
The Brandeis Center likewise notes how the DEI committee at Stanford University alleged that "Jews, unlike other minority group[s], possess privilege and power, Jews and victims of Jew-hatred do not merit or necessitate the attention of the DEI committee" after two students complained about antisemitic incidents on campus.
Following a wave of antisemitic incidents on American campuses in 2023–2024, several Republican congressmen laid the blame on DEI, with Burgess Owens stating DEI programs "are anything but inclusive for Jews".
DEI's lack of inclusion of Jews and contribution to antisemitism were similarly criticized by businessman Bill Ackman and columnist Heather Mac Donald. Following the antisemitism controversy at the University of Pennsylvania, one donor pulled a $100 million donation "because he thought the school was prioritizing D.E.I. over enhancing the business school's academic excellence."
Politicization and ideology:
DEI has, according to some critics, become a distinct ideology or "political agenda", leading to a politicization of universities. Fareed Zakaria, a commentator on CNN, has criticized American universities for "[h]aving gone so far down the ideological path" that "these universities and these presidents cannot make the case clearly that at the center of a university is the free expression of ideas." He opines that "[t]he most obvious lack of diversity at universities, political diversity, which clearly affects their ability to analyze many issues, is not addressed."
Dog-whistle diversity:
Author Christine Michel Carter coined the term "dog-whistle diversity" for TIME in 2017. Influenced by the phrase "dog-whistle politics", dog-whistle diversity is defined as organizations' hiring of groups that have historically been underrepresented or subject to discrimination, for the sake of the social aspect of environmental, social, and corporate governance (ESG).
To investors and stakeholders, hiring these groups sends a coded message that the organization is more open to a diverse workforce, but to the groups hired it suggests the organization lacks effective diversity management or inclusion.
Disability community:
According to some critics, DEI initiatives inadvertently sideline disabled people. Writing for The Conversation in 2017, college professor Stephen Friedman said that "Organizations who are serious about DEI must adopt the frame of producing shared value where business and social goods exist side-by-side".
According to a Time article in 2023, "People with disabilities are being neglected".
This view has been echoed by a number of DEI leaders and activists. Sara Hart Weir, the former president and CEO of the National Down Syndrome Society and co-founder of the Commission for Disability Employment, argues that when deliberating on the vision of DEI success in the United States, policymakers and employers need to take proactive measures to engage with people with disabilities, whom they have historically ignored.
Corinne Gray has argued that, "If you embrace diversity, but ignore disability, you're doing it wrong."
Political and public reaction:
Higher education:
Since 2023, Republican-dominated state legislatures have been considering bills against DEI efforts, primarily at state colleges and universities. The rollback is taking place amid heavy legal pressure: in June 2023 the Supreme Court upended established equal protection law with its decision in Students for Fair Admissions v. Harvard.
This ruling effectively eliminated the use of affirmative action in college admissions but did not directly affect employers. Nevertheless, conservative activists have since organized in the states to dismantle race-conscious policies in various aspects of the economy.
As of February 2024, The Chronicle of Higher Education was tracking 73 bills introduced in state legislatures in 2023–2024. Of these, 8 have become law, 25 have failed to pass, and the rest are pending. Two bills each became law in Florida and Texas, and one each in North Carolina, North Dakota, Tennessee, and Utah. Florida now prohibits public colleges from requiring “political loyalty tests” as a condition of employment, admission, or promotion.
The other Florida law prohibits public colleges from spending state or federal funds on DEI unless required by federal law. One Texas law prohibits DEI practices or programs, including training, that are not in compliance with the state Constitution regarding equality. The other law bans DEI offices and staff, as well as mandatory diversity training. It also bans identity-based diversity statements that give preference regarding race or sex.
Entertainment and media:
Within the film industry, several prominent actors and directors have criticized recently implemented diversity standards, such as those at the Academy Awards. Beginning in 2024, to be eligible for a best-picture nomination at the Academy Awards, a film must meet two of four diversity standards.
Actor Richard Dreyfuss stated the Academy Awards' diversity and inclusion standards "make me vomit", arguing that art should not be morally legislated. Several major film directors, who are voting members of the Academy, anonymously expressed their opposition to the new diversity standards to The New York Post, with one describing them as "contrived".
Film critic Armond White attacked the new standards as "progressive fascism", comparing them to the Hays Code.
Conservative media sources, such as National Review, have also been frequent critics of DEI, with contributor George Leef arguing that it is authoritarian and anti-meritocratic.
Politics:
In the 2020s, DEI came into the spotlight in American politics, especially in state legislatures in Texas and other Republican-controlled states. Several states are considering or have passed legislation targeting DEI in public institutions.
In March 2023, the Texas House of Representatives passed a bill with a rider banning the use of state funds for DEI programs in universities and colleges. In May 2023, Texas passed legislation banning offices and programs promoting DEI at publicly funded colleges and universities.
In Iowa, a bill to ban spending on DEI in public universities was also advanced in March 2023.
Several prominent Republicans positioned themselves as critics, including Florida Governor Ron DeSantis, Texas Governor Greg Abbott, and 2024 presidential candidate Vivek Ramaswamy.
In January 2024 the Florida Board of Education banned federal or state money being used toward DEI programs in universities.
Another significant point of political controversy has been the implementation of DEI frameworks in the military, with Republican politicians frequently criticizing the efforts as "divisive" and as harming military efficiency and recruiting, while Democrats have defended them as beneficial and as strengthening the force.
In July 2023, the House of Representatives voted along partisan lines to ban all DEI offices and initiatives within the Pentagon and military, with all Democrats and four Republican members opposed.
The Senate, under Democratic control, has not acted.
Public boycotts:
Political opposition to corporate DEI efforts in the United States, particularly to marketing criticized as "woke", has led to calls by activists and politicians for boycotts of certain companies, with notable examples being Disney, Target, Anheuser-Busch, and Chick-fil-A.
Commentator Jonathan Turley of The Hill described such boycotts as having had "some success".
Some of these companies' responses to the controversies have, in turn, sparked criticism from progressives of "walking back" or failing DEI commitments.
These three notions (diversity, equity, and inclusion) together represent "three closely linked values" which organizations seek to institutionalize through DEI frameworks. Some experts say diversity and inclusion should be decoupled in some cases. Some frameworks, primarily in Britain, substitute the notion of "equity" with equality: equality, diversity, inclusion (EDI).
Other variations include:
- diversity, equity, inclusion and belonging (DEIB)
- justice, equity, diversity and inclusion (JEDI or EDIJ)
- diversity, equity, inclusion and access (IDEA, DEIA or DEAI)
Diversity refers to the presence of variety within the organizational workforce, such as in identity and identity politics.
Equity refers to concepts of fairness and justice, such as fair compensation and substantive equality. More specifically, equity usually also includes a focus on societal disparities and allocating resources and "decision making authority to groups that have historically been disadvantaged", and taking "into consideration a person's unique circumstances, adjusting treatment accordingly so that the end result is equal."
Finally, inclusion refers to creating an organizational culture in which "all employees feel their voices will be heard" and experience a sense of belonging and integration.
DEI is most often used to describe certain "training" efforts, such as diversity training. Though DEI is best known as a form of corporate training, it also finds implementation within many types of organizations, such as within academia, schools, and hospitals.
In recent years, DEI efforts and policies have generated criticism, some directed at the effectiveness of specific tools such as diversity training, some at their effect on free speech and academic freedom, and some, more broadly, on political or philosophical grounds.
History:
DEI policy emerged from Affirmative Action in the United States. The legal term "affirmative action" was first used in "Executive Order No. 10925", signed by President John F. Kennedy on 6 March 1961, which included a provision that government contractors "take affirmative action to ensure that applicants are employed, and employees are treated [fairly] during employment, without regard to their race, creed, color, or national origin". It was used to promote actions that achieve non-discrimination.
In September 1965, President Lyndon Johnson issued Executive Order 11246 which required government employers to "hire without regard to race, religion and national origin" and "take affirmative action to ensure that applicants are employed and that employees are treated during employment, without regard to their race, color, religion, sex or national origin."
The Civil Rights Act of 1964 prohibited discrimination on the basis of race, color, religion, sex or national origin. Neither executive order nor the Civil Rights Act authorized group preferences. The Senate floor manager of the bill, Senator Hubert Humphrey, declared that the bill “would prohibit preferential treatment for any particular group”, adding “I will eat my hat if this leads to racial quotas.”
However, affirmative action in practice would eventually become synonymous with preferences, goals, and quotas, as upheld or struck down by Supreme Court decisions, even though no law had been passed explicitly permitting discrimination in favor of disadvantaged groups. Some state laws explicitly banned racial preferences, while proposed laws attempting to explicitly legalize racial preferences have failed.
Affirmative action is intended to alleviate under-representation and to promote the opportunities of defined minority groups within a society to give them equal access to that of the majority population.
The philosophical basis of the policy has various rationales, including but not limited to compensation for past discrimination, correction of current discrimination, and the diversification of society. It is often implemented in governmental and educational settings to ensure that designated groups within a society can participate in all promotional, educational, and training opportunities.
The stated justification for affirmative action by its proponents is to help compensate for past discrimination, persecution or exploitation by the ruling class of a culture, and to address existing discrimination. More recently concepts have moved beyond discrimination to include diversity, equity and inclusion as motives for preferring historically underrepresented groups.
In the landmark 1978 case Regents of the University of California v. Bakke, diversity became a factor in constitutional law. The Supreme Court ruled that quotas were illegal but that universities could consider race as a plus factor when trying to foster "diversity" in their classes.
Diversity themes gained momentum in the mid-1980s. At a time when President Ronald Reagan threatened to dismantle equality and affirmative action laws in the 1980s, equality and affirmative action professionals employed by US firms along with equality consultants, engaged in establishing the argument that a diverse workforce should be seen as a competitive advantage rather than just as a legal constraint.
Basically, their message was: do not promote diversity because it is a legal mandate, but because it is good for business. From then on, researchers started to test a number of hypotheses on the business benefits of diversity and of diversity management, known as the business case for diversity.
As of 2003, corporations were spending $8 billion annually on diversity. After the election of Donald Trump in 2016 and the rise of the #MeToo and Black Lives Matter movements, Time magazine stated in 2019 that the DEI industry had "exploded" in size.
Within academia, a 2019 survey found that spending on DEI efforts had increased 27 percent over the five preceding academic years.
One 2020 estimate placed the size of the global diversity and inclusion market at $7.5 billion, of which $3.4 billion was in the United States, projecting it to reach $17.2 billion by 2027.
In 2021, New York magazine stated "the business became astronomically larger than ever" after the murder of George Floyd in May 2020. The Economist has also stated that surveys of international companies indicate that the number of people hired for jobs with "diversity" or "inclusion" in the title more than quadrupled since 2010.
As of 2024, affirmative action rhetoric has been increasingly replaced by emphasis on diversity, equity, and inclusion, while nine states explicitly ban its use in the employment process.
The Supreme Court in 2023 explicitly rejected affirmative action regarding race in college admissions in Students for Fair Admissions v. Harvard. The Court held that affirmative action programs "lack sufficiently focused and measurable objectives warranting the use of race, unavoidably employ race in a negative manner, involve racial stereotyping, and lack meaningful end points. We have never permitted admissions programs to work in that way, and we will not do so today".
Methods and arguments:
In a 2018 article, proponents of DEI argued that because businesses and corporations exist within a larger world, they cannot be completely separated from the issues that exist in society. Therefore, the authors argue the need for DEI to improve coworker relations and teamwork.
Through a DEI plan, organizations outline measures to be taken, including recruiting and retaining personnel, fostering effective communication channels, imparting relevant training, and regulating workplace conduct.
As of 2022, many academic institutions in the US have also made commitments to DEI in different ways, including creating documents and programs and appointing dedicated staff members.
Many accreditation agencies now require support for DEI. As of 2014, information on DEI for both students and professors was widespread in colleges and universities, with many schools requiring training and meetings on the topic. Many scholarships and opportunities at universities also have a secondary purpose of encouraging diversity.
Diversity in higher education can be difficult, with diverse students often feeling reduced to fulfilling a 'diversity quota,' which can carry a high emotional tax.
Within healthcare, DEI reflective groups have been used to enhance cultural sensitivity among mental health professionals. Such reflective spaces help improve mental health professionals' reflexivity and awareness of DEI-related issues, both within direct clinical work with clients, their families, and wider systems, and within professional supervision and teams.
DEI positions also exist with the goal of creating allies for public school students facing social disparities, through resources and staff training. Other proponents of allyship consider impromptu speaking a key skill that allows allies to respond authentically in everyday words and reactions.
Corporate:
Main article: Diversity (business)
Diversity management as a concept appeared and gained momentum in the US in the mid-1980s. At a time when President Ronald Reagan threatened to dismantle equality and affirmative action laws in the US in the 1980s, equality and affirmative action professionals employed by US firms along with equality consultants, engaged in establishing the argument that a diverse workforce should be seen as a competitive advantage rather than just as a legal constraint.
Basically, their message was, do not promote diversity because it is a legal mandate, but because it is good for business (Kelly and Dobbin, 1998).
Following the murder of George Floyd in 2020, some companies made substantial commitments to racial equity by establishing dedicated diversity, equity, and inclusion teams. In early 2024 the Washington Post reported that there is a trend in corporate America to reduce DEI positions and delegate the work to external consultants.
The number of DEI jobs peaked in early 2023, then decreased by 5 percent that year and shrank by a further 8 percent in 2024. The attrition rate for DEI roles has been approximately twice as high as that of non-DEI positions.
The scaling back of DEI initiatives has aligned with a rise in legal challenges and political opposition to systematic endeavors aimed at enhancing racial equity.
Diversity management can be seen to "leverage organisational diversity to enhance organisational justice and achieve better business outcomes".
Several reports, including by McKinsey & Company, claim financial benefits of DEI, but have been criticized for over-generalization, lack of rigour, and failure to distinguish between correlation and causation.
At an aggregate level, Alesina, Harnoss, and Rapoport (2013) have shown that birth-country diversity of the labor force positively impacts a nation's long-term productivity and income.
Firm-level research has provided conditional support for the proposition that workforce diversity per se brings business benefits. In short, whether diversity pays off depends on environmental factors, internal or external to the firm. Dwyer, Richard & Chadwick (2003) found that the effects of gender diversity at the management level are conditional on the firm's strategic orientation, the organizational culture, and the multivariate interaction among these variables.
Schäffner, Gebert, Schöler, & Kirch (2006) found that if the firm's culture incorporates the normative assumption or belief that diversity is an opportunity, then age diversity becomes a predictor of team innovativeness, but not otherwise.
Kearney & Gebert (2006) found that diversity in age, nationality, and functional background has a positive effect on team innovativeness in a high transformational-leadership context, but no effect in a low one.
A curvilinear relationship between diversity and performance was identified by Richard, Barnett, Dwyer, & Chadwick (2004).
Kochan, Bezrukova, Ely, Jackson, Joshi, Jehn et al. (2003), found few positive or negative direct effects of diversity on performance. In the cases that came under their scrutiny, a number of different aspects of the organizational context or group processes moderated the diversity-performance relationship.
Failing to manage diversity properly or developing diversity per se leads to only mixed results (Bell & Berry, 2007; Klein & Harrison, 2007), although Risberg & Corvellec show that approaching diversity management in terms of trying is a way to emphasize the performative dimension of diversity management beyond a reductionist dichotomy between success and failure.
Overall, research suggests that diversity needs to be properly managed if any business benefits are to be reaped. Properly managed under the right conditions, diversity is likely to deliver on its business promise. Given this conditional nature, the topic remains open to debate and further research.
Research indicates that attempts to promote diversity can provoke defensive responses: one study showed that even incidental allusions to diversity during interviews promoted defensive reactions in White male applicants. Indeed, after diversity was mentioned, their performance during the interview deteriorated and their physiological arousal increased.
Criticism and controversy:
According to The Chronicle of Higher Education, institutions are making defensive adjustments to the criticism. Some schools are removing the word “diversity” from titles of offices and jobs; some are closing campus spaces set up for students according to identity; some are ending diversity training; and some have stopped asking all faculty and staff members for written affirmations of their commitment to diversity.
Diversity training:
Further information: Diversity training
Diversity training, a common tool used in DEI efforts, has repeatedly come under criticism as being ineffective or even counterproductive. The Economist has stated that "the consensus now emerging among academics is that many anti-discrimination policies have no effect. What is worse, they often backfire".
A regular claim is that these efforts serve mainly to protect against litigation. Critics have also pointed to limited progress in achieving racial diversity in corporate leadership, particularly for Black professionals, attributing this to a lack of diverse chief diversity officers and to a broad DEI focus that overlooks the specific issues Black professionals face.
A 2007 study of 829 companies over 31 years showed "no positive effects in the average workplace" from diversity training, while the effect was negative where it was mandatory.
According to Harvard University professor in sociology and diversity researcher Frank Dobbin, "[O]n average, the typical all-hands-on-deck, 'everybody has to have diversity training'—that typical format in big companies doesn't have any positive effects on any historically underrepresented groups like black men or women, Hispanic men or women, Asian-American men or women or white women."
Mandatory diversity statements within academia:
The use of mandatory "diversity statements" within academia, wherein an applicant or faculty member outlines their "past contributions" and plans "for advancing diversity, equity and inclusion" if hired, has become controversial and sparked criticism.
The Foundation for Individual Rights and Expression (FIRE) has called such practices an attack on academic freedom, stating that "[v]ague or ideologically motivated DEI statement policies can too easily function as litmus tests for adherence to prevailing ideological views on DEI" and "penalize faculty for holding dissenting opinions on matters of public concern".
According to a 2022 survey conducted by the American Association of University Professors, one in five American colleges and universities includes DEI criteria in tenure standards, including 45.6% of institutions with more than 5,000 students.
Some universities have begun to weigh diversity statements heavily in hiring processes. For example, University of California, Berkeley eliminated three-quarters of applicants for five faculty positions in the life sciences exclusively on the basis of their diversity statements in the hiring cycle of 2018–2019.
The Academic Freedom Alliance (AFA) has called for the end of required diversity statements, stating it "encourages cynicism and dishonesty" and erases "the distinction between academic expertise and ideological conformity".
Social psychologist Jonathan Haidt, who resigned from the Society for Personality and Social Psychology in protest against mandatory diversity statements, has stated that "most academic work has nothing to do with diversity, so these mandatory statements force many academics to betray their quasi-fiduciary duty to the truth by spinning, twisting, or otherwise inventing some tenuous connection to diversity".
Other criticisms include that it "devalues merit", is connected to affirmative action, violates the First Amendment, or functions as a loyalty oath.
A 1,500-person survey conducted by FIRE reported that the issue is highly polarizing for faculty members, with half saying their view more closely aligns with the description of diversity statements as "a justifiable requirement for a job at a university", while the other half saw them as "an ideological litmus test that violates academic freedom".
According to Professor Randall L. Kennedy at Harvard University, "many academics at Harvard and beyond feel intense and growing resentment against the DEI enterprise because of features that are perhaps most evident in the demand for DEI statements", stating "I am a scholar on the left committed to struggles for social justice. The realities surrounding mandatory DEI statements, however, make me wince".
Several U.S. states have implemented legislation to ban mandatory diversity statements. In 2024, MIT announced that diversity statements "will no longer be part of applications for any faculty positions" at the university, becoming the first major university to abandon the practice.
Equity versus equality:
According to DEI frameworks, "equity is different than equality in that equality implies treating everyone as if their experiences are exactly the same." A common identification, especially among critics, is of equality as meaning "equality of opportunities" and equity as "equality of outcome".
This difference between equity and equality is also called the "dilemma of difference". Some have criticized the focus on equity rather than equality, arguing that the former runs contrary to a focus on merit or non-discrimination.
Political scientist Charles Lipson has called "equity" a "mandate to discriminate", threatening the principle of "equality under the law", while Canadian psychologist Jordan Peterson, a frequent critic of DEI, has called equity "the most egregious, self-righteous, historically-ignorant and dangerous" of the three titular notions of DEI.
The debate has also branched into the realm of politics. Commenting on Governor of Texas Greg Abbott calling DEI initiatives "illegal", a spokesperson for his office stated "The issue is not diversity — the issue is that equity is not equality. Here in Texas, we give people a chance to advance based on talent and merit".
Effects of DEI policies on free speech and academic freedom:
In recent years, high-profile incidents of campus conflict have sparked debate about the effect of DEI on the campus environment, academic freedom, and free speech.
The 2021 cancellation of a Massachusetts Institute of Technology (MIT) guest lecture by geophysicist Dorian Abbot after he criticized DEI programs led to media attention and controversy. As a result, MIT empaneled a committee to investigate the state of academic freedom at the university.
The 2023 disruption of a talk by Fifth Circuit Court of Appeals Judge Kyle Duncan at Stanford Law School sparked criticism and discussion in the media, with many focusing on the role of Associate DEI Dean Tirien Steinbach, who joined protesters in denouncing Duncan's presence on campus.
In the wake of the incident, the editorial board of the Wall Street Journal opined that DEI offices have "become weapons to intimidate and limit speech". Steinbach replied with a piece entitled "Diversity and Free Speech Can Coexist at Stanford" that was published in the Journal the following week.
Dean of Stanford Law School Jenny S. Martínez also published a ten-page document addressing the situation and clarifying Stanford's position on free speech. In it, Martinez stated that the university's commitment to DEI "can and should be implemented in ways that are consistent with its commitment to academic freedom and free speech" and that she believed that "the commitment to diversity, equity, and inclusion actually means that we must protect free expression of all views."
She added that the commitment would not take the form of "having the school administration announce institutional positions on a wide range of current social and political issues, make frequent institutional statements about current news events, or exclude or condemn speakers who hold views on social and political issues with whom some or even many in our community disagree."
She criticized this definition of an "inclusive environment" by stating it "can lead to creating and enforcing an institutional orthodoxy."
In April 2023, a group of 29 scientists, including Nobel laureates Dan Shechtman and Arieh Warshel, published a paper that outlined what the authors see as a "clash in science between classical liberal values" and a "new postmodern worldview", which, they argue, is "enforced by 'Diversity, Equity, and Inclusion' (DEI) officers and bureaucracies" and "threatens the entire scientific enterprise."
Two of the authors, Anna Krylov and Jerry Coyne, subsequently argued in an op-ed in the Wall Street Journal that their emphasis on merit--"once anodyne and unobjectionable [...] now contentious and outré, even in the hard sciences"—led to its refusal by major journals and subsequent publication in the Journal of Controversial Ideas.
The 2023 suicide of former Toronto principal Richard Bilkszto led to a new wave of controversy surrounding DEI in the workplace and its impact on freedom of expression.
Bilkszto had earlier filed a lawsuit against the Toronto District School Board in the wake of a 2021 incident at a DEI training seminar; Bilkszto was later diagnosed with "anxiety secondary to a workplace event", and claimed the session and its aftermath had destroyed his reputation.
Bilkszto's lawyer has publicly linked this incident and its aftermath with his death.
In the wake of Bilkszto's death, Ontario Minister of Education Stephen Lecce stated he had asked for a review and "options to reform professional training and strengthen accountability on school boards so this never happens again", calling Bilkszto's allegations before his death "serious and disturbing".
Bilkszto's death generated international attention and renewed debate on DEI and freedom of speech. According to The Globe and Mail, the incident has also been "seized on by a number of prominent right-wing commentators looking to roll-back [DEI] initiatives."
The anti-racism trainer involved in the incident has said they welcome the review by Lecce, stating that the incident has been "weaponized to discredit and suppress the work of people committed [to DEI]".
Antisemitism:
DEI has been accused of ignoring or even contributing to antisemitism. According to Andria Spindel of the Canadian Antisemitism Education Foundation, antisemitism has been largely ignored in the DEI curriculum.
The relationship between DEI and campus antisemitism came under further scrutiny after the October 7, 2023, Hamas attack on Israel and subsequent war in Gaza.
Tabia Lee, a former DEI director at De Anza College in California and a DEI critic, has claimed that DEI frameworks foster antisemitism through their "oppressors and the oppressed" dichotomy, whereby "Jews are categorically placed in the oppressor category" and described as "white oppressors".
She has claimed that her attempts to include Jews under the DEI umbrella were resisted. When her critics asked the college trustees to oust her from her role, one counselor explicitly referenced her attempts to place Jewish students "on the same footing as marginalized groups".
The Brandeis Center has likewise noted that the DEI committee at Stanford University alleged that "Jews, unlike other minority group[s], possess privilege and power, Jews and victims of Jew-hatred do not merit or necessitate the attention of the DEI committee" after two students complained about antisemitic incidents on campus.
Following a wave of antisemitic incidents on American campuses in 2023–2024, several Republican congressmen laid the blame on DEI, with Burgess Owens stating DEI programs "are anything but inclusive for Jews".
DEI's lack of inclusion of Jews and contribution to antisemitism were similarly criticized by businessman Bill Ackman and columnist Heather Mac Donald. Following the antisemitism controversy at the University of Pennsylvania, one donor pulled a $100 million donation "because he thought the school was prioritizing D.E.I. over enhancing the business school's academic excellence."
Politicization and ideology:
According to some critics, DEI has become a distinct ideology or "political agenda", leading to a politicization of universities. Fareed Zakaria, a commentator on CNN, has criticized American universities for "[h]aving gone so far down the ideological path" that "these universities and these presidents cannot make the case clearly that at the center of a university is the free expression of ideas." He opines that "[t]he most obvious lack of diversity at universities, political diversity, which clearly affects their ability to analyze many issues, is not addressed."
Dog-whistle diversity:
Author Christine Michel Carter coined the term "dog-whistle diversity" for TIME in 2017. Influenced by the phrase "dog-whistle politics," dog-whistle diversity is defined as the hiring by organizations of groups that have historically been underrepresented or subject to discrimination, in order to satisfy the social aspect of environmental, social, and corporate governance (ESG).
To investors and stakeholders, hiring these groups sends a coded message that the organization is more open to a diverse workforce, but to the groups hired it suggests the organization lacks effective diversity management or inclusion.
Disability community:
According to some critics, DEI initiatives inadvertently sideline disabled people. Writing for The Conversation in 2017, college professor Stephen Friedman said that "Organizations who are serious about DEI must adopt the frame of producing shared value where business and social goods exist side-by-side".
According to a Time article in 2023, "People with disabilities are being neglected".
This view has been echoed by a number of DEI leaders and activists. Sara Hart Weir, the former president and CEO of the National Down Syndrome Society and co-founder of the Commission for Disability Employment, argues that when deliberating on the vision of DEI success in the United States, policymakers and employers need to take proactive measures to engage with people with disabilities, whom they have historically ignored.
Corinne Gray has argued that "If you embrace diversity, but ignore disability, you're doing it wrong."
Political and public reaction:
Higher education:
Since 2023, Republican-dominated state legislatures have been considering bills targeting DEI efforts, primarily at state colleges and universities. The rollback is taking place amid heavy legal pressure: in June 2023, the Supreme Court upended established equal-protection law with its decision in Students for Fair Admissions v. Harvard.
This ruling effectively eliminated the use of affirmative action in college admissions but did not directly affect employers. Nevertheless, conservative activists have since organized in the states to dismantle race-conscious policies in various aspects of the economy.
As of February 2024, The Chronicle of Higher Education was tracking 73 bills introduced in state legislatures in 2023–2024. Of these, eight had become law, 25 had failed to pass, and the rest were pending. Two bills became law in each of Florida and Texas, and one each in North Carolina, North Dakota, Tennessee, and Utah. Florida now prohibits public colleges from requiring “political loyalty tests” as a condition of employment, admission, or promotion.
The other Florida law prohibits public colleges from spending state or federal funds on DEI unless required by federal law. One Texas law prohibits DEI practices or programs, including training, that are not in compliance with the state Constitution regarding equality. The other law bans DEI offices and staff, as well as mandatory diversity training. It also bans identity-based diversity statements that give preference regarding race or sex.
Entertainment and media:
Within the film industry, several prominent actors and directors have criticized recently implemented diversity standards, such as those at the Academy Awards. Beginning in 2024, to be eligible for a best-picture nomination at the Academy Awards, a film must meet at least two of four diversity standards.
Actor Richard Dreyfuss stated that the Academy's diversity and inclusion standards "make me vomit", arguing that art should not be morally legislated. Several major film directors, who are voting members of the Academy, anonymously expressed their opposition to the new diversity standards to The New York Post, with one describing them as "contrived".
Film critic Armond White attacked the new standards as "progressive fascism", comparing them to the Hays Code.
Conservative media outlets, such as National Review, have also been frequent critics of DEI, with contributor George Leef arguing that it is authoritarian and anti-meritocratic.
Politics:
In the 2020s, DEI came into the spotlight in American politics, especially in state legislatures in Texas and other Republican-controlled states. Several states are considering or have passed legislation targeting DEI in public institutions.
In March 2023, the Texas House of Representatives passed a bill with a rider banning the use of state funds for DEI programs in universities and colleges. In May 2023, Texas passed legislation banning offices and programs promoting DEI at publicly funded colleges and universities.
In Iowa, a bill to ban spending on DEI in public universities was also advanced in March 2023.
Several prominent Republicans positioned themselves as critics, including Florida Governor Ron DeSantis, Texas Governor Greg Abbott, and 2024 presidential candidate Vivek Ramaswamy.
In January 2024, the Florida Board of Education banned the use of federal or state money for DEI programs in universities.
Another significant point of political controversy has been the implementation of DEI frameworks in the military, with Republican politicians frequently criticizing the efforts as "divisive" and as harming military efficiency and recruiting, while Democrats have defended the efforts as beneficial and as strengthening the military.
In July 2023, the House of Representatives voted largely along party lines to ban all DEI offices and initiatives within the Pentagon and the military, with all Democrats and four Republicans opposed.
The Senate, under Democratic control, has not acted.
Public boycotts:
Political opposition to corporate DEI efforts in the United States, particularly to marketing criticized as "woke", has led to calls by activists and politicians for boycotts of certain companies, with notable examples being Disney, Target, Anheuser-Busch, and Chick-fil-A.
Commentator Jonathan Turley of The Hill described such boycotts as having had "some success".
Some of these companies' responses to the controversies have, in turn, drawn criticism from progressives for "walking back" or failing to honor DEI commitments.
See also:
- Affirmative action
- Corporate Equality Index
- Environmental, social, and corporate governance
- Corporate social responsibility
- Fair-chance employer – employer that does not automatically disqualify all people with any criminal background
- Health equity
- Human resources
- Rainbow capitalism
- Title IX, regarding sex discrimination
Social Class(es) vs. Class Conflict Pictured below: Social Classes Puzzle | Economics Learning Game:
Social class:
A social class or social stratum is a grouping of people into a set of hierarchical social categories, the most common being the upper, middle, and lower classes.
Membership of a social class can for example be dependent on education, wealth, occupation, income, and belonging to a particular subculture or social network.
Class is a subject of analysis for sociologists, political scientists, anthropologists, and social historians.
The term has a wide range of sometimes conflicting meanings, and there is no broad consensus on a definition of class.
Some people argue that due to social mobility, class boundaries do not exist. In common parlance, the term social class is usually synonymous with socioeconomic class, defined as "people having the same social, economic, cultural, political or educational status", e.g. the working class, "an emerging professional class" etc.
However, academics distinguish social class from socioeconomic status, using the former to refer to one's relatively stable cultural background and the latter to refer to one's current social and economic situation which is consequently more changeable over time.
The precise measurements of what determines social class in society have varied over time. Karl Marx defined class by one's relationship to the means of production (their relations of production). His understanding of classes in modern capitalist society is that the proletariat work but do not own the means of production, and the bourgeoisie, those who invest and live off the surplus generated by the proletariat's operation of the means of production, do not work at all.
This contrasts with the view of the sociologist Max Weber, who contrasted class, determined by economic position, with social status (German: Stand), which is determined by social prestige rather than simply by relations of production.
The term class is etymologically derived from the Latin classis, which was used by census takers to categorize citizens by wealth in order to determine military service obligations.
In the late 18th century, the term class began to replace classifications such as estates, rank and orders as the primary means of organizing society into hierarchical divisions. This corresponded to a general decrease in significance ascribed to hereditary characteristics and increase in the significance of wealth and income as indicators of position in the social hierarchy.
The existence of social classes is considered normal in many societies, both historic and modern, to varying degrees. Not all systems of social class are alike; they can differ in substantial ways that reflect the cultures of their respective societies, at times having only the existence of social class itself in common.
Click on any of the following blue hyperlinks for more about Social Classes:
- History
- Class society
- Theoretical models
- Consequences of class position
- Classless society
- Relationship between ethnicity and class
- See also:
Class Conflict:
Pictured below: The Pyramid of Capitalist System visualizes and explains class conflict.
In political science, the term class conflict, or class struggle, refers to the economic antagonism and political tension that exist among social classes because of clashing interests, competition for limited resources, and inequalities of power in the socioeconomic hierarchy.
In its simplest manifestation, class conflict refers to the ongoing battle between rich and poor.
In the writings of Karl Marx and Mikhail Bakunin, class struggle is a core tenet and a practical means for effecting radical sociopolitical transformations for the majority working class. It is also a central concept within conflict theories of sociology and political philosophy.
Class conflict can reveal itself through:
- (a) direct violence, such as
- assassinations,
- coups,
- revolutions,
- counterrevolutions, and
- civil wars
- for control of government, natural resources, and labor;
- (b) indirect violence, such as:
- deaths from poverty,
- malnutrition,
- illness,
- and unsafe workplaces;
- (c) economic coercion, such as
- boycotts and strikes,
- the threat of unemployment and capital flight,
- the withdrawal of investment capital;
- (d) political machinations through
- lobbying (legal and illegal),
- bribery of legislators,
- voter suppression
- and disenfranchisement;
- and (e) ideological struggle by way of propaganda and political literature.
In the economic sphere, class conflict is sometimes expressed overtly:
- such as owner lockouts of their employees in an effort to weaken the bargaining power of the employees' union;
- or covertly, such as a worker slowdown of production or the widespread, simultaneous use of sick leave (e.g., "blue flu") to protest
- unfair labor practices,
- low wages,
- poor work conditions,
- or a perceived injustice to a fellow worker.
Click on any of the following blue hyperlinks for more about Class Conflict:
- Oligarchs versus commoners in Ancient Greece
- Patricians versus plebeians in Ancient Rome
- Enlightenment era
- Capitalist societies
- Twentieth and twenty-first centuries
- Relationship to race
- Chronology
- See also: