Copyright © 2015 Bert N. Langford (Images may be subject to copyright. Please send feedback)
Welcome to Our Generation USA!
This Web Page
"Being Human"
covers how we,
both collectively and individually, have evolved to become today's human beings!
See also related web pages:
American Lifestyles
Medical Breakthroughs
Civilization
Human Sexuality
Worst of Humanity
Human Beings, including a Timeline
YouTube Video: Evolution - from ape man to neanderthal - BBC science
Modern humans (Homo sapiens, primarily ssp. Homo sapiens sapiens) are the only extant members of the subtribe Hominina, a branch of the tribe Hominini belonging to the family of great apes.
Humans are characterized by erect posture and bipedal locomotion; high manual dexterity and heavy tool use compared to other animals; and a general trend toward larger, more complex brains and societies.
Early hominins—particularly the australopithecines, whose brains and anatomy are in many ways more similar to ancestral non-human apes—are less often referred to as "human" than hominins of the genus Homo. Several of these hominins used fire, occupied much of Eurasia, and gave rise to anatomically modern Homo sapiens in Africa about 200,000 years ago. They began to exhibit evidence of behavioral modernity around 50,000 years ago. In several waves of migration, anatomically modern humans ventured out of Africa and populated most of the world.
The spread of humans and their large and increasing population has had a profound impact on large areas of the environment and millions of native species worldwide.
Advantages that explain this evolutionary success include a relatively larger brain with a particularly well-developed neocortex, prefrontal cortex and temporal lobes, which enable high levels of abstract reasoning, language, problem solving, sociality, and culture through social learning.
Humans use tools to a much higher degree than any other animal, are the only extant species known to build fires and cook their food, and are the only extant species to clothe themselves and create and use numerous other technologies and arts.
Humans are uniquely adept at utilizing systems of symbolic communication (such as language and art) for self-expression and the exchange of ideas, and for organizing themselves into purposeful groups. Humans create complex social structures composed of many cooperating and competing groups, from families and kinship networks to political states.
Social interactions between humans have established an extremely wide variety of values, social norms, and rituals, which together form the basis of human society. Curiosity and the human desire to understand and influence the environment and to explain and manipulate phenomena (or events) has provided the foundation for developing science, philosophy, mythology, religion, anthropology, and numerous other fields of knowledge.
Though most of human existence has been sustained by hunting and gathering in band societies, increasing numbers of human societies began to practice sedentary agriculture some 10,000 years ago, domesticating plants and animals, thus allowing for the growth of civilization. These human societies subsequently expanded in size, establishing various forms of government, religion, and culture around the world, unifying people within regions to form states and empires.
The rapid advancement of scientific and medical understanding in the 19th and 20th centuries led to the development of fuel-driven technologies and increased lifespans, causing the human population to rise exponentially. Today the global human population is estimated by the United Nations to be near 7.5 billion.
In common usage, the word "human" generally refers to the only extant species of the genus Homo—anatomically and behaviorally modern Homo sapiens.
In scientific terms, the meanings of "hominid" and "hominin" have changed during the recent decades with advances in the discovery and study of the fossil ancestors of modern humans.
The previously clear boundary between humans and apes has blurred: "hominid" is now understood to encompass multiple species of great apes, while "hominin" refers to Homo and its close relatives since the split from chimpanzees. There is also a distinction between anatomically modern humans and archaic Homo sapiens, the earliest fossil members of the species.
The English adjective human is a Middle English term from Old French humain, ultimately from Latin hūmānus, the adjective form of homō "man." The word's use as a noun (with a plural: humans) dates to the 16th century. The native English term man can refer to the species generally (a synonym for humanity), and could formerly refer to specific individuals of either sex, though this latter use is now obsolete.
The species binomial Homo sapiens was coined by Carl Linnaeus in his 18th century work Systema Naturae. The generic name Homo is a learned 18th century derivation from Latin homō "man," ultimately "earthly being". The species-name sapiens means "wise" or "sapient." Note that the Latin word homo refers to humans of either gender, and that sapiens is the singular form (while there is no such word as sapien).
Evolution and range -- Main article: Human evolution
Further information: Anthropology, Homo (genus), and Timeline of human evolution.
The genus Homo evolved and diverged from other hominins in Africa, after the human clade split from the chimpanzee lineage of the hominids (great apes) branch of the primates.
Modern humans, defined as the species Homo sapiens (or, more narrowly, the single extant subspecies Homo sapiens sapiens), proceeded to colonize all the continents and larger islands, arriving in Eurasia 125,000–60,000 years ago, Australia around 40,000 years ago, the Americas around 15,000 years ago, and remote islands such as Hawaii, Easter Island, Madagascar, and New Zealand between the years 300 and 1280.
The closest living relatives of humans are chimpanzees (genus Pan) and gorillas (genus Gorilla). With the sequencing of both the human and chimpanzee genomes, current estimates of the similarity between human and chimpanzee DNA sequences range between 95% and 99%.
A technique called the molecular clock, which estimates the time required for divergent mutations to accumulate between two lineages, can be used to calculate the approximate date of the split between lineages. The gibbons (family Hylobatidae) and orangutans (genus Pongo) were the first groups to split from the line leading to humans, followed by the gorillas (genus Gorilla) and then the chimpanzees (genus Pan).
The split between the human and chimpanzee lineages is placed at around 4–8 million years ago, during the late Miocene epoch. Around the time of this split, human chromosome 2 was formed from the fusion of two ancestral chromosomes, leaving humans with 23 pairs of chromosomes, compared to 24 pairs in the other apes.
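To make the molecular-clock arithmetic concrete, here is a minimal sketch in Python. The divergence fraction and substitution rate below are invented for illustration, not measured values; the formula T = d / (2r) is the standard first-order form, with the factor of 2 reflecting that mutations accumulate independently along both lineages after the split.

```python
# Molecular-clock sketch: estimate the time T since two lineages split
# as T = d / (2 * r), where d is the fraction of divergent sites between
# the two sequences and r is the substitution rate per site per year.

def divergence_time(num_divergent_sites: int, num_sites: int,
                    rate_per_site_per_year: float) -> float:
    """Estimate years since two lineages split."""
    d = num_divergent_sites / num_sites          # observed divergence
    return d / (2 * rate_per_site_per_year)

# Hypothetical numbers: ~1.2% divergence at aligned sites and a rate of
# ~1e-9 substitutions per site per year give a split of about 6 million
# years -- inside the 4-8 Mya range quoted for the human-chimpanzee split.
print(divergence_time(12_000, 1_000_000, 1e-9))  # 6000000.0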
Evidence from the fossil record: There is little fossil evidence for the divergence of the gorilla, chimpanzee and hominin lineages. The earliest fossils that have been proposed as members of the hominin lineage are Sahelanthropus tchadensis dating from 7 million years ago, Orrorin tugenensis dating from 5.7 million years ago, and Ardipithecus kadabba dating to 5.6 million years ago.
Each of these species has been argued to be a bipedal ancestor of later hominins, but all such claims are contested. It is also possible that any one of the three is an ancestor of another branch of African apes, or is an ancestor shared between hominins and other African Hominoidea (apes).
The question of the relation between these early fossil species and the hominin lineage is still to be resolved. From these early species, the australopithecines arose around 4 million years ago and diverged into robust (also called Paranthropus) and gracile branches; possibly one of these (such as A. garhi, dating to 2.5 million years ago) was a direct ancestor of the genus Homo.
The earliest member of the genus Homo is Homo habilis, which evolved around 2.8 million years ago. Homo habilis has long been considered the first species for which there is clear evidence of the use of stone tools. More recently, however, in 2015, stone tools perhaps predating Homo habilis, dated to 3.3 million years old, were discovered in northwestern Kenya. Nonetheless, the brains of Homo habilis were about the same size as those of chimpanzees, and the species' main adaptation was bipedalism, suited to terrestrial living.
During the next million years a process of encephalization began, and by the arrival of Homo erectus in the fossil record, cranial capacity had doubled. Homo erectus was the first hominin to leave Africa, and the species spread through Africa, Asia, and Europe between 1.8 and 1.3 million years ago.
One population of H. erectus, also sometimes classified as the separate species Homo ergaster, stayed in Africa and evolved into Homo sapiens. It is believed that this species was the first to use fire and complex tools.
The earliest transitional fossils between H. ergaster/erectus and archaic humans are from Africa, such as Homo rhodesiensis, but seemingly transitional forms have also been found at Dmanisi, Georgia. These descendants of African H. erectus spread through Eurasia from ca. 500,000 years ago, evolving into H. antecessor, H. heidelbergensis, and H. neanderthalensis.
The earliest fossils of anatomically modern humans, such as the Omo remains from Ethiopia and the Herto fossils (sometimes classified as Homo sapiens idaltu), are from the Middle Paleolithic, about 200,000 years ago. Later fossils from Skhul in Israel and from Southern Europe begin around 90,000 years ago.
Anatomical adaptations:
Human evolution is characterized by a number of morphological, developmental, physiological, and behavioral changes that have taken place since the split between the last common ancestor of humans and chimpanzees.
The most significant of these adaptations are (1) bipedalism, (2) increased brain size, (3) lengthened ontogeny (gestation and infancy), and (4) decreased sexual dimorphism (neoteny). The relationship between all these changes is the subject of ongoing debate. Other significant morphological changes included the evolution of a power and precision grip, a change that first occurred in H. erectus.
Bipedalism is the basic adaptation of the hominin line, and it is considered the main cause behind a suite of skeletal changes shared by all bipedal hominins. The earliest bipedal hominin is considered to be either Sahelanthropus or Orrorin, with Ardipithecus, a full biped, coming somewhat later.
The knuckle-walkers, the gorilla and the chimpanzee, diverged around the same time, and either Sahelanthropus or Orrorin may represent humans' last shared ancestor with those animals. The early bipedals eventually evolved into the australopithecines and, later, the genus Homo.
There are several theories of the adaptational value of bipedalism. It is possible that bipedalism was favored because it freed up the hands for reaching and carrying food, because it saved energy during locomotion, because it enabled long distance running and hunting, or as a strategy for avoiding hyperthermia by reducing the surface exposed to direct sun.
The human species developed a much larger brain than that of other primates—typically 1,330 cm3 (81 cu in) in modern humans, over twice the size of that of a chimpanzee or gorilla.
The pattern of encephalization started with Homo habilis, which at approximately 600 cm3 (37 cu in) had a brain slightly larger than that of chimpanzees; it continued with Homo erectus (800–1,100 cm3 (49–67 cu in)) and reached a maximum in Neanderthals, with an average size of 1,200–1,900 cm3 (73–116 cu in), larger even than in Homo sapiens (but less encephalized).
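"Less encephalized" here refers to brain size relative to body size. One common way to quantify this is Jerison's encephalization quotient (EQ); the sketch below uses his classic mammalian scaling constants, with body-mass figures that are rough assumptions for illustration rather than measured values.

```python
def eq(brain_mass_g: float, body_mass_g: float) -> float:
    """Jerison's encephalization quotient for mammals: observed brain
    mass divided by the brain mass expected for a typical mammal of
    the same body mass (expected = 0.12 * body_mass ** (2/3))."""
    expected = 0.12 * body_mass_g ** (2 / 3)
    return brain_mass_g / expected

# Illustrative comparison (body masses are rough assumptions): a
# heavier-bodied Neanderthal can have a bigger brain in absolute terms
# yet a lower EQ than a lighter-bodied modern human.
print(eq(1450, 78_000))   # Neanderthal-like values  -> ~6.6
print(eq(1330, 62_000))   # modern-human-like values -> ~7.1
```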
The pattern of human postnatal brain growth differs from that of other apes (heterochrony), and allows for extended periods of social learning and language acquisition in juvenile humans. However, the differences between the structure of human brains and those of other apes may be even more significant than differences in size.
The increase in volume over time has affected different areas within the brain unequally: the temporal lobes, which contain centers for language processing, have increased disproportionately, as has the prefrontal cortex, which has been linked to complex decision-making and the moderation of social behavior.
Encephalization has been tied to an increasing emphasis on meat in the diet, or to the development of cooking, and it has been proposed that intelligence increased in response to a growing need to solve social problems as human society became more complex.
The reduced degree of sexual dimorphism is primarily visible in the reduction of the male canine tooth relative to other ape species (except gibbons). Another important physiological change related to sexuality in humans was the evolution of hidden estrus.
Humans are the only ape in which the female is fertile year round and in which no special signals of fertility are produced by the body (such as genital swelling during estrus). Nonetheless, humans retain a degree of sexual dimorphism in the distribution of body hair and subcutaneous fat, and in overall size, with males being around 25% larger than females.
These changes taken together have been interpreted as a result of an increased emphasis on pair bonding as a possible solution to the requirement for increased parental investment due to the prolonged infancy of offspring.
Rise of Homo sapiens:
Further information: Recent African origin of modern humans, Multiregional origin of modern humans, Anatomically modern humans, Archaic human admixture with modern humans, and Early human migrations.
World map of early human migrations according to mitochondrial population genetics (numbers are millennia before present, the North Pole is at the center).
By the beginning of the Upper Paleolithic period (50,000 BP), full behavioral modernity, including language, music, and other cultural universals, had developed. As modern humans spread out from Africa, they encountered other hominins such as Homo neanderthalensis and the so-called Denisovans.
The nature of interaction between early humans and these sister species has been a long-standing source of controversy: did modern humans replace these earlier species, or were the two similar enough to interbreed, in which case the earlier populations may have contributed genetic material to modern humans?
Recent studies of the human and Neanderthal genomes suggest gene flow between archaic Homo sapiens and Neanderthals and Denisovans. In March 2016, studies were published that suggest that modern humans bred with hominins, including Denisovans and Neanderthals, on multiple occasions.
This dispersal out of Africa is estimated to have begun about 70,000 years BP from Northeast Africa. Current evidence suggests that there was only one such dispersal and that it only involved a few hundred individuals. The vast majority of humans stayed in Africa and adapted to a diverse array of environments. Modern humans subsequently spread globally, replacing earlier hominins (either through competition or hybridization). They inhabited Eurasia and Oceania by 40,000 years BP, and the Americas at least 14,500 years BP.
Transition to civilization:
Main articles: Neolithic Revolution and Cradle of civilization
Further information: History of the world
Until about 10,000 years ago, humans lived as hunter-gatherers, generally in small nomadic groups known as band societies, often in caves, and they gradually gained domination over much of the natural environment. The rise of agriculture and the domestication of animals then led to stable human settlements.
The advent of agriculture prompted the Neolithic Revolution, when access to food surplus led to the formation of permanent human settlements, the domestication of animals and the use of metal tools for the first time in history. Agriculture encouraged trade and cooperation, and led to complex society.
The early civilizations of Mesopotamia, Egypt, India, China, the Maya, Greece, and Rome were some of the cradles of civilization. The Late Middle Ages and the Early Modern Period saw the rise of revolutionary ideas and technologies.
Over the next 500 years, exploration and European colonialism brought great parts of the world under European control, leading to later struggles for independence. The concept of the modern world as distinct from an ancient world rests on rapid progress in many areas over a brief period of time.
Advances in all areas of human activity prompted new theories such as evolution and psychoanalysis, which changed humanity's views of itself.
The Scientific Revolution, the Technological Revolution, and the Industrial Revolution, up through the 19th century, brought such discoveries as imaging technology, major innovations in transport, such as the airplane and automobile, and energy development, such as coal and electricity. These advances correlate with population growth (especially in America) and higher life expectancy: the world population increased several times over in the 19th and 20th centuries, and nearly 10% of the estimated 100 billion people who have ever lived were alive during the past century.
With the advent of the Information Age at the end of the 20th century, modern humans live in a world that has become increasingly globalized and interconnected. As of 2010, almost 2 billion humans were able to communicate with one another via the Internet, and 3.3 billion via mobile phone subscriptions.
Although interconnection between humans has encouraged the growth of science, art, discussion, and technology, it has also led to culture clashes and the development and use of weapons of mass destruction.
Human civilization has led to environmental destruction and pollution, contributing significantly to the ongoing mass extinction of other forms of life known as the Holocene extinction event, which may be further accelerated by global warming in the future.
Click on any of the following for more about Human Beings:
- Habitat and population
- Biology
- Psychology
- Behavior
- See also:
- Holocene calendar
- Human impact on the environment
- Dawn of Humanity – a 2015 PBS film
- Human timeline
- Life timeline
- List of human evolution fossils
- Nature timeline
- Archaeology Info
- Homo sapiens – The Smithsonian Institution's Human Origins Program
- Homo sapiens Linnaeus, 1758 at the Encyclopedia of Life
- View the human genome on Ensembl
- Human Timeline (Interactive) – Smithsonian, National Museum of Natural History (August 2016).
Timeline of human evolution
The timeline of human evolution outlines the major events in the development of the human species, Homo sapiens, and the evolution of our ancestors. It includes brief explanations of some of the species, genera, and the higher ranks of taxa that are seen today as possible ancestors of modern humans.
This timeline is based on studies from anthropology, paleontology, developmental biology, and morphology, and on anatomical and genetic data. It does not address the origin of life, which is discussed under abiogenesis, but presents one possible line of evolutionary descent of species that eventually led to humans.
Click on any of the following blue hyperlinks for more about the Timeline of Human Evolution:
- Taxonomy of Homo sapiens
- Timeline
- See also:
- Chimpanzee-human last common ancestor
- Dawn of Humanity (film)
- Homininae
- Human evolution
- Human taxonomy
- Human timeline
- Homo
- Life timeline
- Most recent common ancestor
- List of human evolution fossils
- March of Progress – famous illustration of 25 million years of human evolution
- Nature timeline
- Prehistoric amphibian
- Prehistoric Autopsy
- Prehistoric fish
- Prehistoric reptile
- The Ancestor's Tale by Richard Dawkins – timeline comprising 40 rendezvous points
- Timeline of evolution – explains the evolution of animals living today
- Timeline of prehistory
- Y-DNA haplogroups by ethnic groups
- General:
- Palaeos
- Berkeley Evolution
- History of Animal Evolution
- Tree of Life Web Project – explore complete phylogenetic tree interactively
- Human Timeline (Interactive) – Smithsonian, National Museum of Natural History (August 2016).
Evolution, including Today's Homo Sapiens
Pictured: The seven stages of the evolution of Mankind
Evolution is change in the heritable characteristics of biological populations over successive generations.
Evolutionary processes give rise to biodiversity at every level of biological organization, including the levels of species, individual organisms, and molecules.
Repeated formation of new species (speciation), change within species (anagenesis), and loss of species (extinction) throughout the evolutionary history of life on Earth are demonstrated by shared sets of morphological and biochemical traits, including shared DNA sequences.
These shared traits are more similar among species that share a more recent common ancestor, and can be used to reconstruct a biological "tree of life" based on evolutionary relationships (phylogenetics), using both existing species and fossils. The fossil record includes a progression from early biogenic graphite, to microbial mat fossils, to fossilized multicellular organisms. Existing patterns of biodiversity have been shaped both by speciation and by extinction.
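As an illustration of the distance-based side of phylogenetics mentioned above, the sketch below clusters four taxa from a matrix of pairwise divergences using UPGMA-style average linkage: taxa with a more recent common ancestor (smaller distance) join first. The matrix values are invented for illustration, not real genomic distances.

```python
# Sketch of distance-based tree reconstruction via UPGMA-style
# average-linkage clustering with scipy.
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform
import numpy as np

taxa = ["human", "chimp", "gorilla", "orangutan"]
# Symmetric matrix of hypothetical pairwise divergences (%).
D = np.array([
    [0.0, 1.2, 1.6, 3.1],
    [1.2, 0.0, 1.7, 3.2],
    [1.6, 1.7, 0.0, 3.0],
    [3.1, 3.2, 3.0, 0.0],
])

# Condense the square matrix and cluster; the merge order mirrors the
# expected (human, chimp) grouping joining first.
tree = linkage(squareform(D), method="average")
dendrogram(tree, labels=taxa, no_plot=True)  # set no_plot=False to draw
print(tree)
```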
In the mid-19th century, Charles Darwin formulated the scientific theory of evolution by natural selection, published in his book On the Origin of Species (1859). Evolution by natural selection is a process demonstrated by the observation that more offspring are produced than can possibly survive, along with three facts about populations:
- traits vary among individuals with respect to morphology, physiology, and behavior (phenotypic variation),
- different traits confer different rates of survival and reproduction (differential fitness),
- traits can be passed from generation to generation (heritability of fitness). Thus, in successive generations, members of a population are replaced by the progeny of parents that were better adapted to survive and reproduce in the biophysical environment in which natural selection takes place.
This teleonomy is the quality whereby the process of natural selection creates and preserves traits that are seemingly fitted for the functional roles they perform. The processes by which the changes occur, from one generation to another, are called evolutionary processes or mechanisms.
The four most widely recognized evolutionary processes are: natural selection (including sexual selection), genetic drift, mutation and gene migration due to genetic admixture. Natural selection and genetic drift sort variation; mutation and gene migration create variation.
Consequences of selection can include meiotic drive (unequal transmission of certain alleles), nonrandom mating and genetic hitchhiking.
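A minimal sketch of how several of these mechanisms interact: heritable variation under differential fitness (selection), random sampling in a finite population (drift), and occasional type changes (mutation). All parameters are invented for illustration; this is a toy model, not a description of any real population.

```python
import random

POP_SIZE = 1000
GENERATIONS = 200
FITNESS = {"A": 1.05, "a": 1.00}   # differential fitness (selection)
MUTATION_RATE = 1e-3               # chance an offspring flips type

def next_generation(pop):
    # Selection + heritability: parents are sampled in proportion to
    # fitness, and offspring inherit the parent's type. Sampling a
    # finite population also introduces genetic drift.
    weights = [FITNESS[t] for t in pop]
    offspring = random.choices(pop, weights=weights, k=POP_SIZE)
    # Mutation occasionally creates new variation.
    return [("a" if t == "A" else "A") if random.random() < MUTATION_RATE
            else t for t in offspring]

random.seed(1)
pop = ["A"] * 100 + ["a"] * 900    # initial variation: 10% type "A"
for _ in range(GENERATIONS):
    pop = next_generation(pop)
# The initially rare but fitter type "A" rises toward fixation.
print(pop.count("A") / POP_SIZE)   # typically well above 0.10
```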
In the early 20th century the modern evolutionary synthesis integrated classical genetics with Darwin's theory of evolution by natural selection through the discipline of population genetics.
The importance of natural selection as a cause of evolution was accepted into other branches of biology. Moreover, previously held notions about evolution, such as orthogenesis, evolutionism, and other beliefs about innate "progress" within the largest-scale trends in evolution, became obsolete.
Scientists continue to study various aspects of evolutionary biology by forming and testing hypotheses, constructing mathematical models of theoretical biology and biological theories, using observational data, and performing experiments in both the field and the laboratory.
All life on Earth shares a common ancestor known as the last universal common ancestor (LUCA), which lived approximately 3.5–3.8 billion years ago. This should not be assumed to be the first living organism on Earth; a study in 2015 found "remains of biotic life" from 4.1 billion years ago in ancient rocks in Western Australia.
In July 2016, scientists reported identifying a set of 355 genes from the LUCA of all organisms living on Earth. More than 99 percent of all species that ever lived on Earth are estimated to be extinct.
Estimates of Earth's current species range from 10 to 14 million, of which about 1.9 million are estimated to have been named and 1.6 million documented in a central database to date.
More recently, in May 2016, scientists reported that an estimated 1 trillion species are currently on Earth, with only one-thousandth of one percent described.
In terms of practical application, an understanding of evolution has been instrumental to developments in numerous scientific and industrial fields, including agriculture, human and veterinary medicine, and the life sciences in general.
Discoveries in evolutionary biology have made a significant impact not just in the traditional branches of biology but also in other academic disciplines, including biological anthropology, and evolutionary psychology. Evolutionary computation, a sub-field of artificial intelligence, involves the application of Darwinian principles to problems in computer science.
Click on any of the following blue hyperlinks for more about Evolution:
- History of evolutionary thought
- Heredity
- Variation
- Means:
- Outcomes
- Evolutionary history of life
- Applications
- Social and cultural responses
- See also:
- Argument from poor design
- Biocultural evolution
- Biological classification
- Evidence of common descent
- Evolutionary anthropology
- Evolutionary ecology
- Evolutionary epistemology
- Evolutionary neuroscience
- Evolution of biological complexity
- Evolution of plants
- Timeline of the evolutionary history of life
- Unintelligent design
- Universal Darwinism
- General information:
- Evolution on In Our Time at the BBC.
- "Evolution". New Scientist. Retrieved 2011-05-30.
- "Evolution Resources from the National Academies". Washington, D.C.: National Academy of Sciences. Retrieved 2011-05-30.
- "Understanding Evolution: your one-stop resource for information on evolution". University of California, Berkeley. Retrieved 2011-05-30.
- "Evolution of Evolution – 150 Years of Darwin's 'On the Origin of Species'". Arlington County, VA: National Science Foundation. Retrieved 2011-05-30.
- Human Timeline (Interactive) – Smithsonian, National Museum of Natural History (August 2016).
- Experiments concerning the process of biological evolution
- Lenski, Richard E. "Experimental Evolution". Michigan State University. Retrieved 2013-07-31.
- Chastain, Erick; Livnat, Adi; Papadimitriou, Christos; Vazirani, Umesh (July 22, 2014). "Algorithms, games, and evolution". Proc. Natl. Acad. Sci. U.S.A. Washington, D.C.: National Academy of Sciences. 111 (29): 10620–10623. Bibcode:2014PNAS..11110620C. ISSN 0027-8424. doi:10.1073/pnas.1406556111. Retrieved 2015-01-03.
- Online lectures
- Stearns, Stephen C. "Principles of Evolution, Ecology and Behavior". Retrieved 2011-08-30
Today's Homo Sapiens:
Homo sapiens is the only extant human species. The name is Latin for "wise man" and was introduced in 1758 by Carl Linnaeus (who is himself the lectotype for the species).
Extinct species of the genus Homo include Homo erectus, extant from roughly 1.9 to 0.4 million years ago, and a number of other species (considered by some authors to be subspecies of either H. sapiens or H. erectus).
The age of speciation of H. sapiens out of ancestral H. erectus (or an intermediate species such as Homo antecessor) is estimated to have been roughly 350,000 years ago. Sustained archaic admixture is known to have taken place both in Africa and (following the recent Out-Of-Africa expansion) in Eurasia, between about 100,000 and 30,000 years ago.
The term anatomically modern humans (AMH) is used to distinguish H. sapiens having an anatomy consistent with the range of phenotypes seen in contemporary humans from varieties of extinct archaic humans. This is useful especially for times and regions where anatomically modern and archaic humans co-existed, for example, in Paleolithic Europe.
By the early 2000s, it had become common to use H. s. sapiens for the ancestral population of all contemporary humans, and as such it is equivalent to the binomial H. sapiens in the more restrictive sense (considering H. neanderthalensis a separate species).
Click on any of the following blue hyperlinks for more about Homo Sapiens:
- Name and taxonomy
- Age and speciation process
- Dispersal and archaic admixture
- Anatomy
- Recent evolution
- Behavioral modernity
- Human Timeline (Interactive) – Smithsonian, National Museum of Natural History (August 2016).
Adult Development
YouTube Video About Adult Development
YouTube Video: 7 Adulthood Myths You (Probably) Believe!
Adult development encompasses the changes that occur in biological and psychological domains of human life from the end of adolescence until the end of one's life. These changes may be gradual or rapid, and can reflect positive, negative, or no change from previous levels of functioning.
Changes occur at the cellular level and are partially explained by biological theories of adult development and aging.[1] Biological changes influence psychological and interpersonal/social developmental changes, which are often described by stage theories of human development. Stage theories typically focus on “age-appropriate” developmental tasks to be achieved at each stage.
Erik Erikson and Carl Jung proposed stage theories of human development that encompass the entire life span, and emphasized the potential for positive change very late in life.
The concept of adulthood has legal and socio-cultural definitions. The legal definition of an adult is a person who has reached the age at which they are considered responsible for their own actions, and therefore legally accountable for them. This is referred to as the age of majority, which is age 18 in most cultures, although there is variation from 16 to 21.
The socio-cultural definition of being an adult is based on what a culture normally views as being the required criteria for adulthood, which in turn influences the life of individuals within that culture. This may or may not coincide with the legal definition. Current views on adult development in late life focus on the concept of successful aging, defined as “...low probability of disease and disease-related disability, high cognitive and physical functional capacity, and active engagement with life.”
Biomedical theories hold that one can age successfully by caring for physical health and minimizing loss in function, whereas psychosocial theories posit that capitalizing upon social and cognitive resources, such as a positive attitude or social support from neighbors and friends, is key to aging successfully.
Jeanne Louise Calment, who holds the record for the longest confirmed human lifespan, exemplifies successful aging; she died at the age of 122. Her long life can be attributed to her genetics (both parents lived into their 80s) and to her active lifestyle and optimistic attitude. She enjoyed many hobbies and physical activities and believed that laughter contributed to her longevity. She also used olive oil liberally on her food and skin, which she believed contributed to her long life and youthful appearance.
Click on any of the following blue hyperlinks for more about Adult Development:
- Contemporary and classic theories
- Non-normative cognitive changes in adulthood
- Mental health in adulthood and old age
- Optimizing health and mental well-being in adulthood
- Personality in adulthood
- Intelligence in adulthood
- Relationships
- Retirement
- Long term care
Childhood
YouTube Video About Childhood Development Stages
Childhood is the age span ranging from birth to adolescence. According to Piaget's theory of cognitive development, childhood consists of two stages: preoperational stage and concrete operational stage.
In developmental psychology, childhood is divided into the following developmental stages:
- infancy and toddlerhood (learning to walk and talk, ages birth to 4),
- early childhood (play age, covering the kindergarten and early grade-school years up to grade 4; ages 5-10),
- preadolescence, around ages 11 and 12 (puberty may already begin here in early developers, while a child who has not yet reached puberty is still considered to be in childhood),
- and adolescence, from puberty (early adolescence, ages 13-15) through post-puberty (late adolescence, ages 16-19).
Various childhood factors can affect a person's attitude formation. The concept of childhood emerged during the 17th and 18th centuries, particularly through the educational theories of the philosopher John Locke and the growth of books for and about children. Before this point, children were often seen as incomplete versions of adults.
Time span, age ranges:
The term childhood is non-specific in its time span and can imply a varying range of years in human development. Developmentally and biologically, it refers to the period between infancy and adulthood.
In common terms, childhood is considered to begin at birth and, as a concept associated with play and innocence, to end at adolescence.
In the legal systems of many countries, there is an age of majority when childhood legally ends and a person legally becomes an adult, which ranges anywhere from 15 to 21, with 18 being the most common.
The Convention on the Rights of the Child (CRC) represents a global consensus on the terms of childhood. Childhood expectancy indicates the time span a child has in which to experience childhood.
Eight life events that cut childhood short have been described: death, extreme malnourishment, extreme violence, displacement caused by conflict, being out of school, child labor, having children, and child marriage.
Developmental stages of childhood:
Early Childhood:
Early childhood follows the infancy stage and begins with toddlerhood when the child begins speaking or taking steps independently.
While toddlerhood ends around age three when the child becomes less dependent on parental assistance for basic needs, early childhood continues approximately through age nine.
According to the National Association for the Education of Young Children, early childhood spans from birth to age eight. At this stage children learn through observing, experimenting, and communicating with others. Adults supervise and support the child's development, which gradually leads to the child's autonomy. During this stage, a strong emotional bond is also created between the child and the care providers. Children begin kindergarten at this age, which marks the start of their social lives.
Middle Childhood:
Middle childhood begins at around age ten, approximating primary-school age. It ends around puberty, which typically marks the beginning of adolescence. In this period, children attend school and thus develop socially and mentally. They are at a stage where they make new friends and gain new skills, which enable them to become more independent and enhance their individuality.
Adolescence:
Adolescence is usually marked by the onset of puberty, typically around age 12 for girls and 13 for boys, though puberty may also begin in preadolescence. Adolescence is biologically distinct from childhood, but some cultures accept it as part of social childhood, because most adolescents are legal minors.
The onset of adolescence brings about various physical, psychological and behavioral changes. The end of adolescence and the beginning of adulthood varies by country and by function, and even within a single nation-state or culture there may be different ages at which an individual is considered to be mature enough to be entrusted by society with certain tasks.
Click on any of the following blue hyperlinks for more about Childhood:
- History
- Modern concepts of childhood
- Geographies of childhood
- Nature deficit disorder
- Healthy childhoods
- Children's rights
- Research in social sciences
- See also:
- Birthday party
- Child
- Childhood and migration
- Childhood in Medieval England
- Children's party games
- Coming of age
- Developmental biology
- List of child related articles
- List of traditional children's games
- Rite of passage
- Sociology of childhood
- Street children
- Childhood on In Our Time at the BBC.
- World Childhood Foundation
- Meeting Early Childhood Needs
Demographics of the World
TOP Illustration: Population density (people per km2) by country, 2015 (Courtesy of Ms Sarah Welch - Own work, CC BY-SA 4.0)
BOTTOM Illustration: Life expectancy varies greatly from country to country. It is lowest in certain countries in Africa and higher in Japan, Australia and Spain. (Courtesy of Fobos92 - Own work, CC BY-SA 3.0)
- YouTube Video: World population expected to increase by two billion in 30 yrs - Press Conference (17 June 2019)
- YouTube Video: Empty Planet: Preparing for the Global Population Decline
- YouTube Video: Human Population Through Time
Demographics of the world include the following:
- population density,
- ethnicity,
- education level,
- health measures,
- economic status,
- religious affiliations
- and other aspects of the population.
The overall total population of the world is approximately 7.45 billion, as of July 2016.
Its overall population density is 50 people per km² (129.28 per sq. mile), excluding Antarctica.
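As a quick sanity check, the per-square-mile figure above follows directly from the metric density by unit conversion. The sketch below (a minimal check using the rounded values quoted in this section) reproduces it to within rounding: 50 people per km² converts to about 129.5 per square mile, against the 129.28 quoted here.

```python
# Minimal sanity check of the population-density figures quoted above.
# Uses the rounded values from the text, so the result differs from
# the quoted 129.28 per sq. mile only by rounding in the source.

density_km2 = 50.0                 # people per km^2, as quoted
KM2_PER_SQMI = 1.609344 ** 2       # 1 mile = 1.609344 km, so 1 sq mi ~ 2.590 km^2

density_sqmi = density_km2 * KM2_PER_SQMI
print(f"{density_sqmi:.2f} people per sq. mile")   # ~129.50
```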
Nearly two-thirds of the population lives in Asia and is predominantly urban and suburban, with more than 2.5 billion in the countries of China and India combined. The world's fairly low literacy rate (83.7%) is attributable to impoverished regions. Extremely low literacy rates are concentrated in three regions: the Arab states, South and West Asia, and Sub-Saharan Africa.
The world's largest ethnic group is Han Chinese, and Mandarin is the world's most spoken language in terms of native speakers.
Human migration has been shifting toward cities and urban centers, with the urban population jumping from 29% in 1950, to 50.5% in 2005. Working backwards from the United Nations prediction that the world will be 51.3 percent urban by 2010, Dr. Ron Wimberley, Dr. Libby Morris and Dr. Gregory Fulkerson estimated May 23, 2007 to be the first time the urban population outnumbered the rural population in history.
China and India are the most populous countries, as the birth rate has consistently dropped in developed countries and until recently remained high in developing countries. Tokyo is the largest urban agglomeration in the world.
The total fertility rate of the world is estimated at 2.52 children per woman, which is above the replacement fertility rate of approximately 2.1. However, fertility is unevenly distributed, ranging from 0.91 children per woman in Macau to 7.68 in Niger. The United Nations estimated an annual population increase of 1.14% for the year 2000.
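The replacement rate of roughly 2.1 mentioned above is not arbitrary: on average, each woman must bear slightly more than one daughter who survives to childbearing age, and because roughly 105 boys are born for every 100 girls, the required total fertility comes out just above 2.1 in low-mortality populations. The sketch below illustrates that arithmetic; the survival probability is an illustrative assumption, not a figure from this page.

```python
# Back-of-the-envelope arithmetic behind the ~2.1 replacement fertility rate.
# The survival probability is an assumed round number for a low-mortality
# population, included only for illustration.

sex_ratio_at_birth = 1.05   # boys born per girl (well-established average)
p_survive = 0.97            # assumed share of girls reaching childbearing age

# A woman must average one surviving daughter to replace herself:
replacement_tfr = (1 + sex_ratio_at_birth) / p_survive
print(f"replacement TFR ~ {replacement_tfr:.2f}")   # ~2.11
```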
There are approximately 3.38 billion females in the world, and about 3.41 billion males.
People under 14 years of age made up over a quarter of the world population (26.3%), and people age 65 and over made up less than one-tenth (7.9%) in 2011.
The world population growth rate is approximately 1.09% per year.
The world population more than tripled during the 20th century, from about 1.65 billion in 1900 to 5.97 billion in 1999. It reached the 2 billion mark in 1927, the 3 billion mark in 1960, 4 billion in 1974, and 5 billion in 1987. Currently, population growth is fastest among low-wealth, Third World countries.
The UN projects a world population of 9.15 billion in 2050, which is a 32.69% increase from 2010 (6.89 billion).
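The projection above can be sanity-checked with simple arithmetic. Using the rounded 2010 figure of 6.89 billion gives an increase of about 32.8 percent, so the quoted 32.69% evidently rests on a slightly more precise 2010 base (near 6.90 billion). Naively compounding the 1.09% annual growth rate for 40 years would instead overshoot to well above 10 billion, which shows that the UN projection assumes growth keeps slowing. A minimal sketch, using only the figures quoted in this section:

```python
# Sanity checks on the UN 2050 projection, using the rounded figures
# quoted in this section.

pop_2010 = 6.89e9       # world population, 2010 (rounded, from the text)
pop_2050 = 9.15e9       # UN projection for 2050 (from the text)
growth_rate = 0.0109    # ~1.09% per year (from the text)

# Percentage increase implied by the rounded figures:
increase_pct = (pop_2050 - pop_2010) / pop_2010 * 100
print(f"2010 -> 2050 increase: {increase_pct:.2f}%")   # ~32.80% (text: 32.69%)

# Compounding 1.09%/yr for 40 years overshoots the projection,
# because the UN assumes the growth rate continues to fall:
naive_2050 = pop_2010 * (1 + growth_rate) ** 40
print(f"naive compounding: {naive_2050 / 1e9:.2f} billion")   # ~10.63 billion
```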
Click on any of the following blue hyperlinks for more about Demographics of the World:
- History
- Cities
- Population density
- Population distribution
- Ethnicity
- Religion
- Marriage
- Health
- Demographic statistics
- Languages
- Education
- See also:
Middle Age
YouTube Video: 35 min. LOW IMPACT AEROBIC WORKOUT - Fun and easy to follow for beginners and seniors!
Middle age is the period of age beyond young adulthood but before the onset of old age.
According to the Oxford English Dictionary, middle age is between 45 and 65: "The period between early adulthood and old age, usually considered as the years from about 45 to 65."
The US Census lists middle age as 45 to 65. Merriam-Webster lists it as 45 to 64, while the prominent psychologist Erik Erikson saw it starting a little earlier, defining middle adulthood as between 40 and 65.
The Collins English Dictionary lists it between the ages of 40 and 60, and the Diagnostic and Statistical Manual of Mental Disorders (the standard diagnostic manual of the American Psychiatric Association) used to define middle age as 40 to 60, but as of DSM-IV (1994) revised the definition upward to 45 to 65.
Young Adulthood:
Further information: Young adult (psychology)
This time in the lifespan is considered to be the developmental stage of those between 20 and 40 years old. Recent developmental theories have recognized that development occurs across a person's entire life, as they experience cognitive, physical, social, and personality changes.
Middle Adulthood:
This period of life can be referred to as middle age, a time span defined as between ages 45 and 65.
Many changes may occur between young adulthood and this stage. The body may slow down, and the middle-aged might become more sensitive to diet, substance abuse, stress, and rest.
Chronic health problems can become an issue along with disability or disease.
Approximately one centimeter of height may be lost per decade. Emotional responses and retrospection vary from person to person. Experiencing a sense of mortality, sadness, or loss is common at this age.
Those in middle adulthood or middle age continue to develop relationships and adapt to changes in them, such as interacting with growing and grown children and with aging parents. Community involvement is fairly typical of this stage of adulthood,[8] as well as continued career development.
Physical Characteristics:
Middle-aged adults may begin to show visible signs of aging. This process can be more rapid in women who have osteoporosis. Changes might occur in the nervous system. The ability to perform complex tasks remains intact.
Women between 48 and 55 experience menopause, which ends natural fertility.
Menopause can have many side effects, some welcome and some not so welcome. Men may also experience physical changes. Changes can occur to skin and other changes may include decline in physical fitness including a reduction in aerobic performance and a decrease in maximal heart rate. These measurements are generalities and people may exhibit these changes at different rates and times.
The mortality rate can begin to increase from 45 onwards, mainly due to health problems such as heart disease, cancer, hypertension, and diabetes.
Still, the majority of middle-aged people in industrialized nations can expect to live into old age.
Cognitive characteristics
Erik Erikson refers to this period of adulthood as the generativity-versus-stagnation stage. Persons in middle adulthood or middle age may have some cognitive loss. This loss usually remains unnoticeable, because life experiences and strategies are developed to compensate for any decrease in mental abilities.
Social and personality characteristics
Marital satisfaction remains but other family relationships can be more difficult. Career satisfaction focuses more on inner satisfaction and contentedness and less on ambition and the desire to 'advance'.
Even so, career changes can often occur. Middle adulthood or middle age can be a time when a person re-examines their life by taking stock and evaluating their accomplishments.
Morality may change and become more conscious. The perception that those in this stage of development or life undergo a 'mid-life' crisis is largely false. This period in life is usually satisfying and tranquil.
Personality characteristics remain stable throughout this period, even as awareness of one's own mortality becomes harder to avoid. Relationships in middle adulthood may continue to evolve into stable connections.
Old Age (see also the web page Senior Living)
YouTube Video by Jane Fonda: Walking Cardio Workout : Level 2
Old age refers to ages nearing or surpassing the life expectancy of human beings, and is thus the end of the human life cycle. Terms and euphemisms for old people include old people (worldwide usage), seniors (American usage), senior citizens (British and American usage), older adults (in the social sciences), the elderly, and elders (in many cultures, including the cultures of aboriginal people).
Old people often have limited regenerative abilities and are more susceptible to disease, syndromes, and sickness than younger adults. The organic process of ageing is called senescence, the medical study of the aging process is called gerontology, and the study of diseases that afflict the elderly is called geriatrics. The elderly also face other social issues around retirement, loneliness, and ageism.
Old age is a social construct rather than a definite biological stage, and the chronological age denoted as "old age" varies culturally and historically.
In 2011, the United Nations proposed a human rights convention that would specifically protect older persons.
Definitions of old age include official definitions, sub-group definitions, and four dimensions as follows.
Official Definitions:
Old age comprises "the later part of life; the period of life after youth and middle age . . ., usually with reference to deterioration". At what age old age begins cannot be universally defined because it differs according to the context.
Most developed-world countries have accepted the chronological age of 65 years as a definition of 'elderly' or older person. The United Nations has agreed that 60+ years may usually be denoted as old age; this is the first attempt at an international definition of old age.
However, for its study of old age in Africa, the World Health Organization (WHO) set 50 as the beginning of old age. At the same time, the WHO recognized that the developing world often defines old age, not by years, but by new roles, loss of previous roles, or inability to make active contributions to society.
Most developed Western countries set the age of 60 to 65 for retirement. Being 60–65 years old is usually a requirement for becoming eligible for senior social programs. However, various countries and societies consider the onset of old age as anywhere from the mid-40s to the 70s.
The definitions of old age continue to change especially as life expectancy in developed countries has risen to beyond 80 years old. In October 2016, a paper published in the science journal Nature presented the conclusion that the maximum human lifespan is an average age of 115, with an absolute upper limit of 125 years. However, the authors' methods and conclusions drew criticism from the scientific community, who concluded that the study was flawed.
Sub-group definitions:
Gerontologists have recognized the very different conditions that people experience as they grow older within the years defined as old age. In developed countries, most people in their 60s and early 70s are still fit, active, and able to care for themselves. After 75, however, people become increasingly frail, a condition marked by serious mental and physical debilitation.
Therefore, rather than lumping together all people who have been defined as old, some gerontologists have recognized the diversity of old age by defining sub-groups. One study distinguishes the young old (60 to 69), the middle old (70 to 79), and the very old (80+).
Another study's sub-grouping is young-old (65 to 74), middle-old (75–84), and oldest-old (85+). A third sub-grouping is "young old" (65–74), "old" (74–84), and "old-old" (85+).
Delineating sub-groups in the 65+ population enables a more accurate portrayal of significant life changes.
Two British scholars, Paul Higgs and Chris Gilleard, have added a "fourth age" sub-group. In British English, the "third age" is "the period in life of active retirement, following middle age". Higgs and Gilleard describe the fourth age as "an arena of inactive, unhealthy, unproductive, and ultimately unsuccessful ageing."
Dimensions of old age:
Key Concepts in Social Gerontology lists four dimensions: chronological, biological, psychological, and social.
Wattis and Curran add a fifth dimension: developmental.
Chronological age may differ considerably from a person's functional age. The distinguishing marks of old age normally occur in all five dimensions, at different times and at different rates for different persons. In addition to chronological age, people can be considered old because of the other dimensions of old age. For example, people may be considered old when they become grandparents or when they begin to do less or different work in retirement.
Senior citizen:
Senior citizen is a common euphemism for an old person used in American English, and sometimes in British English. It implies that the person being referred to is retired. This in turn usually implies that the person is over the retirement age, which varies according to country. Synonyms include old age pensioner or pensioner in British English, and retiree and senior in American English. Some dictionaries describe widespread use of "senior citizen" for people over the age of 65.
When defined in an official context, senior citizen is often used for legal or policy-related reasons in determining who is eligible for certain benefits available to the age group.
It is used in general usage instead of traditional terms such as old person, old-age pensioner, or elderly as a courtesy and to signify continuing relevance of and respect for this population group as "citizens" of society, of senior rank.
The term was apparently coined in 1938 during a political campaign. The famed caricaturist Al Hirschfeld claimed on several occasions that his father, Isaac Hirschfeld, invented the term 'senior citizen'. It has come into widespread use in recent decades in legislation, commerce, and common speech. Especially in less formal contexts, it is often abbreviated as "senior(s)", which is also used as an adjective.
In commerce, some businesses offer customers of a certain age a "senior discount". The age at which these discounts are available varies between 55, 60, 62 or 65, and other criteria may also apply. Sometimes a special "senior discount card" or other proof of age needs to be obtained and produced to show entitlement.
Age Qualifications:
The age which qualifies for senior citizen status varies widely. In governmental contexts, it is usually associated with an age at which pensions or medical benefits for the elderly become available. In commercial contexts, where it may serve as a marketing device to attract customers, the age is often significantly lower.
In the United States, the standard retirement age is currently 66 (gradually increasing to 67).
In Canada, the OAS (Old Age Security) pension is available at 65 (the Conservative government of Stephen Harper had planned to gradually increase the age of eligibility to 67, starting in the years 2023–2029, although the Liberal government of Justin Trudeau is considering leaving it at 65), and the CPP (Canada Pension Plan) as early as age 60.
The AARP allows couples in which one spouse has reached the age of 50 to join, regardless of the age of the other spouse.
Marks of Old Age:
The distinguishing characteristics of old age are both physical and mental. The marks of old age are so unlike the marks of middle age that legal scholar Richard Posner suggests that, as an individual transitions into old age, he/she can be thought of as different persons "time-sharing" the same identity.
These marks do not occur at the same chronological age for everyone. Also, they occur at different rates and order for different people. Marks of old age can easily vary between people of the same chronological age.
A basic mark of old age that affects both body and mind is "slowness of behavior." This "slowing down principle" finds a correlation between advancing age and slowness of reaction and physical and mental task performance. However, studies from Buffalo University and Northwestern University have shown that the elderly are a happier age group than their younger counterparts.
Physical marks of old age include the following:
- Bone and joint. Old bones are marked by "thinning and shrinkage." This might result in a loss of height (about two inches (5 cm) by age 80), a stooping posture in many people, and a greater susceptibility to bone and joint diseases such as osteoarthritis and osteoporosis.
- Chronic diseases. Some older persons have at least one chronic condition and many have multiple conditions. In 2007–2009, the most frequently occurring conditions among older persons in the United States were uncontrolled hypertension (34%), diagnosed arthritis (50%), and heart disease (32%).
- Chronic mucus hypersecretion (CMH) "defined as coughing and bringing up sputum . . . is a common respiratory symptom in elderly persons."
- Dental problems. Old people may have less saliva and less ability to maintain oral hygiene, which increases the chance of tooth decay and infection.
- Digestive system. About 40% of the time, old age is marked by digestive disorders such as difficulty in swallowing, inability to eat enough and to absorb nutrition, constipation and bleeding.
- Essential Tremor (ET) is an uncontrollable shaking in a part of the upper body. It is more common in the elderly and symptoms worsen with age.
- Eyesight. Presbyopia can occur by age 50; it hinders reading, especially of small print in low lighting. The speed with which an individual reads and the ability to locate objects may also be impaired. By age 80, more than half of all Americans either have a cataract or have had cataract surgery.
- Falls. Old age spells risk for injury from falls that might not cause injury to a younger person. Every year, about one-third of those 65 years old and over half of those 80 years old fall. Falls are the leading cause of injury and death for old people.
- Gait change. Some aspects of gait normally change with old age. Gait velocity slows after age 70. Double stance time (i.e., time with both feet on the ground) also increases with age. Because of gait change, old people sometimes appear to be walking on ice.
- Hair usually becomes grayer and also might become thinner. As a rule of thumb, around age 50, about 50% of Europeans have 50% grey hair. Many men are affected by balding, and women enter menopause.
- Hearing. By age 75 and older, 48% of men and 37% of women encounter impairments in hearing. Of the 26.7 million people over age 50 with a hearing impairment, only one in seven uses a hearing aid. In the 70–79 age range, the incidence of partial hearing loss affecting communication rises to 65%, predominantly among low-income males.
- Hearts can become less efficient in old age with a resulting loss of stamina. In addition, atherosclerosis can constrict blood flow.
- Immune function. Less efficient immune function (Immunosenescence) is a mark of old age.
- Lungs might expand less well; thus, they provide less oxygen.
- Mobility impairment or loss. "Impairment in mobility affects 14% of those between 65 and 74, but half of those over 85." Loss of mobility is common in old people. This inability to get around has serious "social, psychological, and physical consequences".
- Pain afflicts old people at least 25% of the time, increasing with age up to 80% for those in nursing homes. Most pains are rheumatological or malignant.
- Sexuality remains important throughout the lifespan and the sexual expression of "typical, healthy older persons is a relatively neglected topic of research". Sexual attitudes and identity are established in early adulthood and change minimally over the course of a lifetime. However, sexual drive in both men and women may decrease as they age. That said, there is a growing body of research on people's sexual behaviors and desires in later life that challenges the "asexual" image of older adults. People aged 75–102 continue to experience sensuality and sexual pleasure. Other known sexual behaviors in older age groups include sexual thoughts, fantasies and erotic dreams, masturbation, oral sex, vaginal and anal intercourse.
- Skin loses elasticity, becomes drier, and more lined and wrinkled.
- Sleep trouble holds a chronic prevalence of over 50% in old age and results in daytime sleepiness. In a study of 9,000 persons with a mean age of 74, only 12% reported no sleep complaints. By age 65, deep sleep goes down to about 5%.
- Taste buds diminish so that by age 80 taste buds are down to 50% of normal. Food becomes less appealing and nutrition can suffer.
- Over the age of 85, thirst perception decreases, such that 41% of the elderly drink insufficiently.
- Urinary incontinence is often found in old age.
- Voice. In old age, vocal cords weaken and vibrate more slowly. This results in a weakened, breathy voice that is sometimes called an "old person's voice."
Mental marks of old age include the following:
- Adaptable describes most people in their old age. Despite the stress of old age, they are described as "agreeable" and "accepting." However, old age dependence induces feelings of incompetence and worthlessness in a minority.
- Caution marks old age. This antipathy toward "risk-taking" stems from the fact that old people have less to gain and more to lose by taking risks than younger people.
- Depressed mood. According to Cox, Abramson, Devine, and Hollon (2012), old age is a risk factor for depression caused by prejudice (i.e., "deprejudice"). When people are prejudiced against the elderly and then become old themselves, their anti-elderly prejudice turns inward, causing depression. "People with more negative age stereotypes will likely have higher rates of depression as they get older." Old age depression results in the over-65 population having the highest suicide rate.
- Fear of crime in old age, especially among the frail, sometimes weighs more heavily than concerns about finances or health and restricts what they do. The fear persists in spite of the fact that old people are victims of crime less often than younger people.
- Mental disorders afflict about 15% of people aged 60+ according to estimates by the World Health Organization. Another survey taken in 15 countries reported that mental disorders of adults interfered with their daily activities more than physical problems.
- Reduced mental and cognitive ability may afflict old age. Memory loss is common in old age due to the decrease in speed of information being encoded, stored, and retrieved. It takes more time to learn new information. Dementia is a general term for memory loss and other intellectual abilities serious enough to interfere with daily life. Its prevalence increases in old age from about 10% at age 65 to about 50% over age 85. Alzheimer's disease accounts for 50 to 80 percent of dementia cases. Demented behavior can include wandering, physical aggression, verbal outbursts, depression, and psychosis.
- Set in one's ways describes a mind set of old age. A study of over 400 distinguished men and women in old age found a "preference for the routine." Explanations include old age's toll on the "fluid intelligence" and the "more deeply entrenched" ways of the old.
Click on any of the following blue hyperlinks for more about Old Age:
- Perceptions of old age
- Old age from a middle-age perspective
- Old age from an old-age perspective
- Old age from society's perspective
- Old age from simulated perspective
- Old age frailty
- Prevalence of frailty
- Markers of frailty
- Misconceptions of frail people
- Care and costs
- Death and frailty
- Religiosity in old age
- Demographic changes
- Psychosocial aspects
- Theories of old age
- Life expectancy
- Old age benefits
- Assistance: devices and personal
Human Rights including Universal Declaration by the United Nations
YouTube Video: What are the basic universal human rights?
Pictured below: United Nations Web Page about Universal Human Rights
Human rights are moral principles or norms that describe certain standards of human behavior and are regularly protected as natural and legal rights in municipal and international law. They are commonly understood as inalienable fundamental rights "to which a person is inherently entitled simply because she or he is a human being", and which are "inherent in all human beings" regardless of their nation, location, language, religion, ethnic origin or any other status.
These rights are applicable everywhere and at every time in the sense of being universal, and they are egalitarian in the sense of being the same for everyone.
They are regarded as requiring empathy and the rule of law and imposing an obligation on persons to respect the human rights of others, and it is generally considered that they should not be taken away except as a result of due process based on specific circumstances; for example, human rights may include freedom from unlawful imprisonment, torture and execution.
The doctrine of human rights has been highly influential within international law, global and regional institutions. Actions by states and non-governmental organisations form a basis of public policy worldwide. The idea of human rights suggests that "if the public discourse of peacetime global society can be said to have a common moral language, it is that of human rights".
The strong claims made by the doctrine of human rights continue to provoke considerable scepticism and debates about the content, nature and justifications of human rights to this day.
The precise meaning of the term right is controversial and the subject of continued philosophical debate. While there is consensus that human rights encompass a wide variety of rights, such as the right to a fair trial, protection against enslavement, prohibition of genocide, free speech, and a right to education (including the right to comprehensive sexuality education, among others), there is disagreement about which of these particular rights should be included within the general framework of human rights. Some thinkers suggest that human rights should be a minimum requirement to avoid the worst-case abuses, while others see them as a higher standard.
Many of the basic ideas that animated the human rights movement developed in the aftermath of the Second World War and the events of the Holocaust, culminating in the adoption of the Universal Declaration of Human Rights in Paris by the United Nations General Assembly in 1948.
Ancient peoples did not have the same modern-day conception of universal human rights. The true forerunner of human rights discourse was the concept of natural rights which appeared as part of the medieval natural law tradition that became prominent during the European Enlightenment with such philosophers as John Locke, Francis Hutcheson and Jean-Jacques Burlamaqui, and which featured prominently in the political discourse of the American Revolution and the French Revolution.
From this foundation, the modern human rights arguments emerged over the latter half of the 20th century, possibly as a reaction to slavery, torture, genocide and war crimes, as a realization of inherent human vulnerability and as being a precondition for the possibility of a just society.
Click on any of the following blue hyperlinks for more about Human Rights:
- History of the concept
- Philosophy
- Criticism
- Classification
- International protection and promotion
- Non-governmental actors
- Violations
- Substantive rights
- Relationship with other topics
- See also:
- Human rights portal
- Children's rights
- Fundamental rights
- Human rights in cyberspace
- Human rights group
- Human rights literature
- Human Rights Watch
- International human rights law
- International human rights instruments
- Intersex human rights
- List of human rights organisations
- LGBT rights
- Minority rights
- Public international law
- International Year of Human Rights
- European Court of Human Rights
- "Human rights". Internet Encyclopedia of Philosophy.
- UN Practitioner's Portal on HRBA Programming UN centralized web portal on the Human Rights-Based Approach to Development Programming
- Simple Guide to the UN Treaty Bodies (International Service for Human Rights)
- Country Reports on Human Rights Practices U.S. Department of State.
- International Center for Transitional Justice (ICTJ)
- The International Institute of Human Rights
- IHRLaw.org International Human Rights Law – comprehensive online resources and news
Universal Declaration of Human Rights by the United Nations:
The Universal Declaration of Human Rights (UDHR) is a milestone document in the history of human rights.
Drafted by representatives with different legal and cultural backgrounds from all regions of the world, the Declaration was proclaimed by the United Nations General Assembly in Paris on 10 December 1948 (General Assembly resolution 217 A) as a common standard of achievement for all peoples and all nations.
It sets out, for the first time, fundamental human rights to be universally protected and it has been translated into over 500 languages.
Click here to download a Copy of the PDF.
21st Century including a Timeline of the 21st Century
- YouTube Video of the 9/11 Terrorist attack on America (9/11/2001)
- YouTube Video: Space Shuttle Columbia Disaster
- YouTube Video: The Assassination of Osama Bin Laden (5/2/2011)
Click here for the Timeline of the 21st Century -- by Year.
The 21st (twenty-first) century is the current century of the Anno Domini era or Common Era, in accordance with the Gregorian calendar. It began on January 1, 2001, and will end on December 31, 2100. It is the first century of the 3rd millennium. It is distinct from the century known as the 2000s which began on January 1, 2000 and will end on December 31, 2099.
The first years of the 21st century have thus far been marked by the rise of a global economy and Third World consumerism, mistrust in government, deepening global concern over terrorism and an increase in the power of private enterprise.
The Arab Spring of the early 2010s led to mixed outcomes in the Arab world.
The Third Industrial Revolution, which began around the 1980s, continues into the present and is expected to transition into Industry 4.0 and the Fourth Industrial Revolution as early as 2030.
Millennials and Generation Z come of age and rise to prominence in this century.
In 2016, the United Kingdom voted to leave the European Union, a withdrawal known as Brexit.
Click on any of the following blue hyperlinks for more about the 21st Century:
- Transitions and changes
- Pronunciation
- Events
- Politics and wars
- Science and technology
- Civil unrest
- Linguistic diversity
- Disasters
- Economics and industry
- Sports
- Arts and entertainment
- Issues and concerns
- Astronomical events
- See also:
- 21st century in fiction
- Timelines of modern history
- Contemporary art
- Reuters – The State of the World: the story of the 21st century
- Long Bets – a foundation to promote long-term thinking
- Long Now – a long-term cultural institution
- Scientific American magazine (September 2005 issue) – The Climax of Humanity
- MapReport – 21st Century Event World Map
Human Dignity
- YouTube Video by Pope Francis: Peace depends on human dignity
- YouTube Video: GCSE Human Dignity animation | CAFOD
- YouTube Video: Life and Dignity of the Human Person
Dignity is the right of a person to be valued and respected for their own sake, and to be treated ethically. It is of significance in morality, ethics, law and politics as an extension of the Enlightenment-era concepts of inherent, inalienable rights. The term may also be used to describe personal conduct, as in "behaving with dignity".
The English word "dignity", attested from the early 13th century, comes from Latin dignitas (worthiness) by way of French dignité.
Modern use:
English-speakers often use the word "dignity" in proscriptive and cautionary ways: for example, in politics it can be used to critique the treatment of oppressed and vulnerable groups and peoples, but it has also been applied to cultures and sub-cultures, to religious beliefs and ideals, and even to animals used for food or research.
"Dignity" also has descriptive meanings pertaining to the worth of human beings. In general, the term has various functions and meanings depending on how the term is used and on the context.
In ordinary modern usage, the word denotes "respect" and "status", and it is often used to suggest that someone is not receiving a proper degree of respect, or even that they are failing to treat themselves with proper self-respect.
There is also a long history of special philosophical use of this term. However, it is rarely defined outright in political, legal, and scientific discussions. International proclamations have thus far left dignity undefined, and scientific commentators, such as those arguing against genetic research and algeny, cite dignity as a reason but are ambiguous about its application.
Violations of Human Dignity:
Categories:
Human dignity can be violated in multiple ways. The main categories of violations are:
- Humiliation
Violations of human dignity in terms of humiliation refer to acts that humiliate or diminish the self-worth of a person or a group. Acts of humiliation are context dependent, but we normally have an intuitive understanding of where such a violation occurs. As Schachter noted, “it has been generally assumed that a violation of human dignity can be recognized even if the abstract term cannot be defined. ‘I know it when I see it even if I cannot tell you what it is’”.
More generally, the etymology of the word “humiliation” has a universal characteristic in the sense that in all languages the word involves “downward spatial orientation”, in which “something or someone is pushed down and forcefully held there”.
This approach is common in judicial decisions where judges refer to violations of human dignity as injuries to people's self-worth or their self-esteem.
- Instrumentalization or objectification
This aspect refers to treating a person as an instrument or as means to achieve some other goal. This approach builds on Immanuel Kant's moral imperative stipulating that we should treat people as ends or goals in themselves, namely as having ultimate moral worth which should not be instrumentalized.
- Degradation
Violations of human dignity as degradation refer to acts that degrade the value of human beings. These are acts that, even if done by consent, convey a message that diminishes the importance or value of all human beings. They consist of practices and acts that modern society generally considers unacceptable for human beings, regardless of whether subjective humiliation is involved, such as selling oneself to slavery, or when a state authority deliberately puts prisoners in inhuman living conditions.
- Dehumanization
These are acts that strip a person or a group of their human characteristics. They may involve describing or treating people as animals or as a lower type of human being. This has occurred in genocides such as the Holocaust and in Rwanda, where the minority was compared to insects.
Examples:
Some of the practices that violate human dignity include the following:
Both absolute and relative poverty are violations of human dignity, although they also have other significant dimensions, such as social injustice.
Absolute poverty is associated with overt exploitation and connected to humiliation (for example, being forced to eat food from other people's garbage), but being dependent upon others to stay alive is a violation of dignity even in the absence of more direct violations.
Relative poverty, on the other hand, is a violation because the cumulative experience of not being able to afford the same clothes, entertainment, social events, education, or other features of typical life in that society results in subtle humiliation; social rejection; marginalization; and consequently, a diminished self-respect.
Another example of a violation of human dignity, especially for women in developing countries, is lack of sanitation. Having no access to toilets currently leaves about 1 billion people of the world with no choice other than to defecate in the open, which the Deputy Secretary-General of the United Nations has declared an affront to personal dignity.
Human dignity is also violated by the practice of employing people in India for "manual scavenging" of human excreta from unsanitary toilets – usually by people of a lower caste, and more often by women than men.
A further example of violation of human dignity, affecting women mainly in developing countries, is female genital mutilation (FGM).
The movie The Magic Christian depicts a wealthy man (Peter Sellers) and his son (Ringo Starr) who test the limits of dignity by forcing people to perform self-degrading acts for money. The Simpsons episode "Homer vs. Dignity" has a similar plot.
Click on any of the following blue hyperlinks for more about Human Dignity:
- Philosophical history
- Religion
- United Nations Universal Declaration of Human Rights
- Medicine
- Law
- See also
Mortality Rate
- YouTube Video: Cancer Incidence and Mortality Through 2020
- YouTube Video: Public Health Approaches to Reducing U.S. Infant Mortality
- YouTube Video: The brain-changing benefits of exercise | Wendy Suzuki
Mortality rate, or death rate, is a measure of the number of deaths (in general, or due to a specific cause) in a particular population, scaled to the size of that population, per unit of time.
Mortality rate is typically expressed in units of deaths per 1,000 individuals per year; thus, a mortality rate of 9.5 (out of 1,000) in a population of 1,000 would mean 9.5 deaths per year in that entire population, or 0.95% out of the total. It is distinct from "morbidity", which is either the prevalence or incidence of a disease, and also from the incidence rate (the number of newly appearing cases of the disease per unit of time).
In the generic form, mortality rates are calculated as:
d / p × 10^n, where d is the number of deaths in the period, p is the size of the population, and 10^n is a conversion factor (e.g., n = 3 expresses the rate per 1,000 individuals).
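A minimal Python sketch of this calculation (the function name and figures are illustrative only, not from the source):

    def mortality_rate(deaths, population, n=3):
        # Generic form from the text: d / p * 10**n (n = 3 gives deaths per 1,000).
        return deaths / population * 10 ** n

    # 950 deaths in a population of 100,000 over one year:
    print(mortality_rate(950, 100_000))       # 9.5 deaths per 1,000 per year
    print(mortality_rate(950, 100_000, n=2))  # 0.95, i.e., 0.95% of the population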
Related measures of mortality:
Other specific measures of mortality include the following (a short worked example appears after the list):
- Crude death rate – the total number of deaths per year per 1,000 people. As of 2017 the crude death rate for the whole world is 8.33 per 1,000 (up from 7.8 per 1,000 in 2016) according to the current CIA World Factbook.
- Perinatal mortality rate – the sum of neonatal deaths and fetal deaths (stillbirths) per 1,000 births.
- Maternal mortality ratio – the number of maternal deaths per 100,000 live births in the same time period.
- Maternal mortality rate – the number of maternal deaths per 1,000 women of reproductive age in the population (generally defined as 15–44 years of age).
- Infant mortality rate – the number of deaths of children less than 1 year old per 1,000 live births.
- Child mortality rate – the number of deaths of children less than 5 years old per 1,000 live births.
- Standardized mortality ratio (SMR) – a proportional comparison to the number of deaths that would have been expected if the population had been of a standard composition in terms of age, gender, etc.
- Age-specific mortality rate (ASMR) – the total number of deaths per year per 1,000 people of a given age (e.g. age 62 last birthday).
- Cause-specific mortality rate – the mortality rate for a specified cause of death.
- Cumulative death rate – a measure of the (growing) proportion of a group that die over a specified period (often as estimated by techniques that account for missing data by statistical censoring).
- Case fatality rate (CFR) – the proportion of cases of a particular medical condition that lead to death.
- Sex-specific mortality rate – the total number of deaths in a population of a specific sex within a given time interval.
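As a hedged illustration of how several of these measures relate to raw counts, the following Python sketch uses entirely hypothetical one-year figures (all variable names and numbers are invented) and follows the definitions in the list above:

    # Hypothetical one-year counts for an illustrative population (all invented):
    live_births     = 20_000
    stillbirths     = 90
    neonatal_deaths = 70      # deaths within the first 28 days of life
    infant_deaths   = 120     # deaths under age 1
    child_deaths    = 180     # deaths under age 5
    maternal_deaths = 15
    cases           = 4_000   # cases of a particular condition
    case_deaths     = 60      # deaths among those cases

    infant_mortality_rate    = infant_deaths / live_births * 1_000      # per 1,000 live births -> 6.0
    child_mortality_rate     = child_deaths / live_births * 1_000       # per 1,000 live births -> 9.0
    # "births" is taken here to mean live births plus stillbirths:
    perinatal_mortality_rate = (neonatal_deaths + stillbirths) / (live_births + stillbirths) * 1_000
    maternal_mortality_ratio = maternal_deaths / live_births * 100_000  # per 100,000 live births -> 75.0
    case_fatality_rate       = case_deaths / cases                      # a proportion -> 0.015 (1.5%)

    print(infant_mortality_rate, child_mortality_rate, maternal_mortality_ratio)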
Use in epidemiology:
In most cases there are few ways, if any, to obtain exact mortality rates, so epidemiologists use estimation to approximate them. Mortality rates are usually difficult to measure due to language barriers, health infrastructure issues, conflict, and other reasons.
Maternal mortality has additional challenges, especially as they pertain to stillbirths, abortions, and multiple births. In some countries, during the 1920s, a stillbirth was defined as "a birth of at least twenty weeks' gestation in which the child shows no evidence of life after complete birth". In most countries, however, a stillbirth was defined as "the birth of a fetus, after 28 weeks of pregnancy, in which pulmonary respiration does not occur".
Census data and vital statistics:
Ideally, all mortality estimation would be done using vital statistics and census data. Census data give detailed information about the population at risk of death, while vital statistics provide information about live births and deaths in the population. Often, however, census data or vital statistics are not available. This is especially true in developing countries, countries in conflict, areas where natural disasters have caused mass displacement, and other areas experiencing a humanitarian crisis.
Household surveys:
Household surveys or interviews are another way in which mortality rates are often assessed.
There are several methods to estimate mortality in different segments of the population. One example is the sisterhood method, in which researchers estimate maternal mortality by contacting women in populations of interest and asking whether they have a sister, whether the sister is of child-bearing age (usually 15 or older), and, if so, conducting an interview or administering written questions about possible deaths among sisters (a simplified sketch of this logic appears at the end of this subsection). The sisterhood method, however, does not work in cases where sisters may have died before the sister being interviewed was born.
Orphanhood surveys estimate mortality by asking children about the mortality of their parents. The method has often been criticized for producing heavily biased adult mortality estimates, for several reasons. The adoption effect is one such instance: orphans often do not realize that they are adopted, and interviewers may not realize that an adoptive or foster parent is not the child's biological parent. There is also the issue of parents being reported on by multiple children, while some adults have no children and are therefore not counted in mortality estimates.
Widowhood surveys estimate adult mortality by asking respondents about a deceased husband or wife. One limitation concerns divorce: people may be more likely to report that they are widowed in places where there is great social stigma around being a divorcee.
Another limitation is that multiple marriages introduce biased estimates, so individuals are often asked about their first marriage. Biases will also be significant if the deaths of spouses are associated, as in countries with large AIDS epidemics.
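Here is the simplified sisterhood-method sketch mentioned above, in Python. It is deliberately naive: the records are invented, and the published method applies age-specific adjustment factors that are omitted here.

    # Each invented record: (sisters who reached age 15, of whom died of maternal causes)
    responses = [(2, 0), (3, 1), (1, 0), (4, 0), (2, 1)]

    sisters_at_risk = sum(reached for reached, _ in responses)
    deaths          = sum(died for _, died in responses)

    # Naive lifetime risk of maternal death among sisters (no adjustment factors):
    print(deaths / sisters_at_risk)   # 2 / 12, roughly 0.167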
Sampling:
Sampling refers to the selection of a subset of the population of interest to efficiently gain information about the entire population. Samples should be representative of the population of interest.
Cluster sampling is an approach in which each member of the population is assigned to a group (a cluster); clusters are then randomly selected, and all members of the selected clusters are included in the sample. Often combined with stratification techniques (in which case it is called multistage sampling), cluster sampling is the approach most often used by epidemiologists. In areas of forced migration, however, sampling error is larger, so cluster sampling is not the ideal choice there.
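A minimal Python sketch of single-stage cluster sampling as described above (the cluster and household names are made up; a real survey would also weight estimates by selection probabilities):

    import random

    random.seed(0)

    # Invented population: 100 clusters (e.g., villages), each a list of households.
    clusters = {c: [f"hh{c}-{i}" for i in range(random.randint(20, 40))]
                for c in range(100)}

    # Stage 1: randomly select 10 clusters.
    selected = random.sample(sorted(clusters), k=10)

    # Stage 2: include every household in each selected cluster.
    sample = [hh for c in selected for hh in clusters[c]]
    print(len(sample), "households from clusters", selected)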
Click on any of the following blue hyperlinks for more about The Mortality Rate:
- Mortality statistics
- Economics
- See also:
- Biodemography
- Compensation law of mortality
- Demography
- Gompertz–Makeham law of mortality
- List of causes of death by rate
- List of countries by number of deaths
- List of countries by birth rate
- List of countries by death rate
- List of countries by life expectancy
- Maximum life span
- Micromort
- Mortality displacement
- Risk adjusted mortality rate
- Vital statistics
- Medical statistics
- Weekend effect
- Data regarding death rates by age and cause in the United States (from Data360)
- Complex Emergency Database (CE-DAT): Mortality data from conflict-affected populations
- Human Mortality Database: Historic mortality data from developed nations
- Google – public data: Mortality in the U.S.
Natural Selection
- YouTube Video: Natural Selection
- YouTube Video: Examples of Natural Selection
- YouTube Video: Simulating Natural Selection
Natural selection is the differential survival and reproduction of individuals due to differences in phenotype. It is a key mechanism of evolution, the change in the heritable traits characteristic of a population over generations.
Charles Darwin popularized the term "natural selection", contrasting it with artificial selection, which in his view is intentional, whereas natural selection is not.
Variation exists within all populations of organisms. This occurs partly because random mutations arise in the genome of an individual organism, and offspring can inherit such mutations. Throughout the lives of the individuals, their genomes interact with their environments to cause variations in traits.
The environment of a genome includes the molecular biology in the cell, other cells, other individuals, populations, species, as well as the abiotic environment. Because individuals with certain variants of the trait tend to survive and reproduce more than individuals with other, less successful variants, the population evolves. Other factors affecting reproductive success include sexual selection (now often included in natural selection) and fecundity selection.
Natural selection acts on the phenotype, the characteristics of the organism which actually interact with the environment, but the genetic (heritable) basis of any phenotype that gives that phenotype a reproductive advantage may become more common in a population.
Over time, this process can result in populations that specialize for particular ecological niches (microevolution) and may eventually result in speciation (the emergence of new species, macroevolution). In other words, natural selection is a key process in the evolution of a population.
Natural selection is a cornerstone of modern biology. The concept, published by Darwin and Alfred Russel Wallace in a joint presentation of papers in 1858, was elaborated in Darwin's influential 1859 book On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life. He described natural selection as analogous to artificial selection, a process by which animals and plants with traits considered desirable by human breeders are systematically favoured for reproduction.
The concept of natural selection originally developed in the absence of a valid theory of heredity; at the time of Darwin's writing, science had yet to develop modern theories of genetics. The union of traditional Darwinian evolution with subsequent discoveries in classical genetics formed the modern synthesis of the mid-20th century. The addition of molecular genetics has led to evolutionary developmental biology, which explains evolution at the molecular level.
While genotypes can slowly change by random genetic drift, natural selection remains the primary explanation for adaptive evolution.
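The core logic described above (heritable variation, differential survival, and reproduction with mutation) can be illustrated with a toy simulation. This Python sketch is an invented illustration under simplified assumptions, not a model from the source:

    import random

    random.seed(1)

    # Toy population: each individual carries one heritable trait in [0, 1]
    # that raises its chance of surviving to reproduce (a stand-in phenotype).
    POP_SIZE = 500
    pop = [random.random() for _ in range(POP_SIZE)]

    def survival_prob(trait):
        return 0.2 + 0.6 * trait   # higher trait value -> better survival

    for generation in range(20):
        survivors = [t for t in pop if random.random() < survival_prob(t)]
        # Survivors reproduce; offspring inherit the trait with a small mutation.
        pop = [min(1.0, max(0.0, random.choice(survivors) + random.gauss(0, 0.02)))
               for _ in range(POP_SIZE)]

    print(round(sum(pop) / len(pop), 2))   # mean trait climbs over the generations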
Click on any of the following blue hyperlinks for more about Natural Selection:
- Historical development
- Terminology
- Mechanism
- Classification
- Arms races
- Evolution by means of natural selection
- Genetic basis
- Impact
- See also:
Nature vs. Nurture
- YouTube Video: The battle between nature and nurture | Irene Gallego Romero | TEDxNTU
- YouTube Video: Nature vs. Nurture in Child Development
- YouTube Video: Behavioral Theory - Nature vs Nurture Personality?
The nature versus nurture debate involves whether human behavior is determined by the environment, either prenatal or during a person's life, or by a person's genes. The alliterative expression "nature and nurture" in English has been in use since at least the Elizabethan period and goes back to medieval French.
The combination of the two concepts as complementary is ancient. Nature is what we think of as pre-wiring, influenced by genetic inheritance and other biological factors. Nurture is generally taken as the influence of external factors after conception, e.g. the product of exposure, experience, and learning on an individual.
The phrase in its modern sense was popularized by the English Victorian polymath Francis Galton, the modern founder of eugenics and behavioral genetics, discussing the influence of heredity and environment on social advancement. Galton was influenced by the book On the Origin of Species written by his half-cousin, Charles Darwin.
The view that humans acquire all or almost all their behavioral traits from "nurture" was termed tabula rasa ("blank slate") by John Locke in 1690. A "blank slate view" in human developmental psychology assuming that human behavioral traits develop almost exclusively from environmental influences, was widely held during much of the 20th century (sometimes termed "blank-slatism").
The debate between "blank-slate" denial of the influence of heritability, and the view admitting both environmental and heritable traits, has often been cast in terms of nature versus nurture. These two conflicting approaches to human development were at the core of an ideological dispute over research agendas throughout the second half of the 20th century. As both "nature" and "nurture" factors were found to contribute substantially, often in an inextricable manner, such views were seen as naive or outdated by most scholars of human development by the 2000s.
The strong dichotomy of nature versus nurture has thus been claimed to have limited relevance in some fields of research. Close feedback loops have been found in which "nature" and "nurture" influence one another constantly, as seen in self-domestication.
In ecology and behavioral genetics, researchers think nurture has an essential influence on nature. Similarly in other fields, the dividing line between an inherited and an acquired trait becomes unclear, as in epigenetics or fetal development.
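One classical way that the relative contributions of heredity and environment are quantified is Falconer's formula from twin studies, h² = 2(r_MZ − r_DZ). The correlations below are hypothetical, purely to show the arithmetic:

    # Falconer's formula: h^2 = 2 * (r_MZ - r_DZ)
    r_mz = 0.86   # hypothetical correlation between identical (monozygotic) twins
    r_dz = 0.60   # hypothetical correlation between fraternal (dizygotic) twins

    heritability  = 2 * (r_mz - r_dz)    # h^2 = 0.52 (share attributed to genes)
    shared_env    = r_mz - heritability  # c^2 = 0.34 (shared environment)
    nonshared_env = 1 - r_mz             # e^2 = 0.14 (unique environment + error)
    print(heritability, shared_env, nonshared_env)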
Click on any of the following blue hyperlinks for more about "Nature vs. Nurture":
- History of the debate
- Heritability estimates
- Interaction of genes and environment
- Heritability of intelligence
- Personality traits
- Genetics
- See also:
Human Psychology vs. Human Psychiatry
- YouTube Video: What's the DIFFERENCE? Choosing YOUR Mental Health Professional | Kati Morton
- YouTube Video: You aren't at the mercy of your emotions -- your brain creates them | Lisa Feldman Barrett
- YouTube Video: The Difference between a Psychiatrist and Psychologist
Psychology is the science of behavior and mind. Psychology includes the study of conscious and unconscious phenomena, as well as feeling and thought. It is an academic discipline of immense scope.
Psychologists seek an understanding of the emergent properties of brains, and all the variety of phenomena linked to those emergent properties, joining this way the broader neuroscientific group of researchers. As a social science it aims to understand individuals and groups by establishing general principles and researching specific cases.
In this field, a professional practitioner or researcher is called a psychologist and can be classified as a social, behavioral, or cognitive scientist. Psychologists attempt to understand the role of mental functions in individual and social behavior, while also exploring the physiological and biological processes that underlie cognitive functions and behaviors.
Psychologists explore behavior and mental processes, including:
- perception,
- cognition,
- attention,
- emotion,
- intelligence,
- subjective experiences,
- motivation,
- brain functioning,
- and personality.
This extends to interaction between people, such as interpersonal relationships, including psychological resilience, family resilience, and other areas. Psychologists of diverse orientations also consider the unconscious mind.
Psychologists employ empirical methods to infer causal and correlational relationships between psychosocial variables. In addition to, or in opposition to, employing empirical and deductive methods, some—especially clinical and counseling psychologists—at times rely upon symbolic interpretation and other inductive techniques.
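As a toy illustration of the correlational side of such methods (the variables and scores below are invented, not data from any study), Python's statistics module can compute a Pearson correlation:

    from statistics import correlation   # Pearson's r; requires Python 3.10+

    # Invented scores for ten participants on two psychosocial variables:
    sleep_hours = [6, 7, 5, 8, 7, 6, 9, 5, 8, 7]
    mood_score  = [4, 6, 3, 7, 6, 5, 8, 4, 7, 6]

    r = correlation(sleep_hours, mood_score)
    print(round(r, 2))   # strongly positive, but correlation is not causation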
Psychology has been described as a "hub science": medicine tends to draw on psychological research via neurology and psychiatry, whereas the social sciences most commonly draw directly on sub-disciplines within psychology.
While psychological knowledge is often applied to the assessment and treatment of mental health problems, it is also directed towards understanding and solving problems in several spheres of human activity. By many accounts psychology ultimately aims to benefit society.
The majority of psychologists are involved in some kind of therapeutic role, practicing in clinical, counseling, or school settings. Many do scientific research on a wide range of topics related to mental processes and behavior, and typically work in university psychology departments or teach in other academic settings (e.g., medical schools, hospitals).
Some are employed in industrial and organizational settings, or in other areas such as human development and aging, sports, health, and the media, as well as in forensic investigation and other aspects of law.
Click on any of the following blue hyperlinks for more about Psychology:
- Etymology and definitions
- History
- Disciplinary organization
- Major schools of thought
- Themes
- Applications
- Research methods
- Contemporary issues in methodology and practice
- Ethics
Psychiatry is the medical specialty devoted to the diagnosis, prevention, and treatment of mental disorders.
These include various maladaptations related to mood, behaviour, cognition, and perceptions.
See glossary of psychiatry.
Initial psychiatric assessment of a person typically begins with a case history and mental status examination. Physical examinations and psychological tests may be conducted. On occasion, neuroimaging or other neurophysiological techniques are used.
Mental disorders are often diagnosed in accordance with clinical concepts listed in diagnostic manuals such as the International Classification of Diseases (ICD), edited and used by the World Health Organization (WHO) and the widely used Diagnostic and Statistical Manual of Mental Disorders (DSM), published by the American Psychiatric Association (APA).
The fifth edition of the DSM (DSM-5), published in 2013, reorganized the larger categories of various diseases and expanded upon the previous edition to include information and insights consistent with current research.
The combined treatment of psychiatric medication and psychotherapy has become the most common mode of psychiatric treatment in current practice, but contemporary practice also includes a wide variety of other modalities, e.g., assertive community treatment, community reinforcement, and supported employment. Treatment may be delivered on an inpatient or outpatient basis, depending on the severity of functional impairment or on other aspects of the disorder in question. An inpatient may be treated in a psychiatric hospital.
Research and treatment within psychiatry as a whole are conducted on an interdisciplinary basis, e.g., with:
- epidemiologists,
- mental health counselors,
- nurses,
- psychologists,
- public health specialists,
- radiologists
- or social workers.
Click on any of the following blue hyperlinks for more about Psychiatry:
- Etymology
- Theory and focus
- Clinical application
- Treatment
- History
- Controversy and criticism
- See also:
- Alienist
- Medical psychology
- Biopsychiatry controversy
- Telepsychiatry
- Telemental health
- Bullying in psychiatry
- Psychiatry organizations
- Psychiatry Innovation Lab
- Psychiatry Online
- New York State Psychiatric Institute: Psychiatry Video Archives - Adult Psychiatry Grand Rounds
- Asia-Pacific Psychiatry, official journal of the Pacific Rim College of Psychiatrists.
- Early Intervention in Psychiatry, official journal of the International Early Psychosis Association.
- Psychiatry and Clinical Neurosciences, official journal of the Japanese Society of Psychiatry and Neurology.
Code of Conduct
Pictured below: DSM Code of Business Conduct
- YouTube Video of the U.S. Military Code of Conduct 2014
- YouTube Video: This is our Business Code of Conduct
- YouTube Video: Code of Conduct for the American Red Cross During Disaster Recovery (See below for more*)
*-- Above YouTube: The Code of Conduct for The International Red Cross and Red Crescent Movement and NGOs in Disaster Relief was developed and agreed upon by eight of the world's largest disaster response agencies in the summer of 1994. The Code of Conduct, like most professional codes, is a voluntary one. It lays down 10 points of principle which all humanitarian actors should adhere to in their disaster response work, and goes on to describe the relationships that agencies working in disasters should seek with donor governments, host governments, and the UN system. The Code is self-policing. There is as yet no international association for disaster-response NGOs which possesses any authority to sanction its members.
A code of conduct is a set of rules outlining the norms, rules, and responsibilities of, and or proper practices for, an individual.
Company codes of conduct:
A company code of conduct is a code of conduct commonly written for employees of a company, which protects the business and informs the employees of the company's expectations. It is appropriate for even the smallest of companies to create a document containing important information on expectations for employees. The document does not need to be complex, or have elaborate policies.
Failure of an employee to follow a company code of conduct can have negative consequences. In Morgan Stanley v. Skowron, 989 F. Supp. 2d 356 (S.D.N.Y. 2013), applying New York's faithless servant doctrine, the court held that a hedge fund's employee engaging in insider trading in violation of his company's code of conduct, which also required him to report his misconduct, must repay his employer the full $31 million his employer paid him as compensation during his period of faithlessness.
In practice:
A code of conduct can be an important part in establishing an inclusive culture, but it is not a comprehensive solution on its own. An ethical culture is created by the organization's leaders who manifest their ethics in their attitudes and behavior.
Studies of codes of conduct in the private sector show that their effective implementation must be part of a learning process that requires training, consistent enforcement, and continuous measurement/improvement.
Simply requiring members to read the code is not enough to ensure that they understand it and will remember its contents. The proof of effectiveness is when employees/members feel comfortable enough to voice concerns and believe that the organization will respond with appropriate action.
Accountants' code of conduct:
In its 2007 International Good Practice Guidance, "Defining and Developing an Effective Code of Conduct for Organizations", the International Federation of Accountants provided the following working definition:
- "Principles, values, standards, or rules of behavior that guide the decisions, procedures and systems of an organization in a way that (a) contributes to the welfare of its key stakeholders, and (b) respects the rights of all constituents affected by its operations."
Examples:
- Banking Code
- Coca-Cola Code of Conduct
- Code of Conduct for the International Red Cross and Red Crescent Movement and NGOs in Disaster Relief
- Code of Hammurabi
- Code of the United States Fighting Force
- Declaration of Geneva
- Declaration of Helsinki
- Don't be evil
- Eight Precepts
- Election Commission of India's Model Code of Conduct
- Ethic of reciprocity (Golden Rule)
- Five Pillars of Islam
- Geneva convention
- Hippocratic Oath
- ICC Cricket Code of Conduct
- International Code of Conduct against Ballistic Missile Proliferation (ICOC or Hague Code of Conduct)
- Israel Defense Forces – Code of Conduct
- Journalist's Creed
- Moral Code of the Builder of Communism
- Patimokkha
- Pirate code of the Brethren
- Psychiatrists' Ethics – Madrid Declaration on Ethical Standards for Psychiatric Practice
- Psychologists' Code of Conduct
- Recurse Center "Social Rules"
- Rule of St. Benedict
- Solicitors Regulation Authority (SRA) Code of Conduct 2011 (for solicitors in the UK)
- Ten Commandments
- Ten Indian commandments
- Ten Precepts (Taoism)
- Uniform Code of Military Justice
- Vienna Convention on Diplomatic Relations
- Warrior code
Human (and other) Respiratory System(s)
- YouTube Video: How the Human Respiratory System Works
- YouTube Video: Comprehending the structure and location of various respiratory organs that together make the respiratory system.
- YouTube Video: The two most common causes of Acute Respiratory Failure
The respiratory system (also respiratory apparatus, ventilatory system) is a biological system consisting of specific organs and structures used for gas exchange in animals and plants.
The anatomy and physiology that make this happen vary greatly, depending on the size of the organism, the environment in which it lives, and its evolutionary history. In land animals the respiratory surface is internalized as the linings of the lungs.
Gas exchange in the lungs occurs in millions of small air sacs called alveoli in mammals and reptiles, but atria in birds. These microscopic air sacs have a very rich blood supply, thus bringing the air into close contact with the blood.
These air sacs communicate with the external environment via a system of airways, or hollow tubes, of which the largest is the trachea, which branches in the middle of the chest into the two main bronchi. These enter the lungs where they branch into progressively narrower secondary and tertiary bronchi that branch into numerous smaller tubes, the bronchioles.
In birds the bronchioles are termed parabronchi. It is the bronchioles, or parabronchi, that generally open into the microscopic alveoli in mammals and atria in birds. Air has to be pumped from the environment into the alveoli or atria by the process of breathing, which involves the muscles of respiration.
In most fish, and a number of other aquatic animals (both vertebrates and invertebrates), the respiratory system consists of gills, which are either partially or completely external organs, bathed in the watery environment. This water flows over the gills by a variety of active or passive means. Gas exchange takes place in the gills, which consist of thin or very flat filaments and lamellae that expose a very large surface area of highly vascularized tissue to the water.
Other animals, such as insects, have respiratory systems with very simple anatomical features, and in amphibians even the skin plays a vital role in gas exchange. Plants also have respiratory systems but the directionality of gas exchange can be opposite to that in animals. The respiratory system in plants includes anatomical features such as stomata, that are found in various parts of the plant.
Click on any of the following blue hyperlinks for more about Our Respiratory System:
- Mammals
- Exceptional mammals
- Birds
- Reptiles
- Amphibians
- Fish
- Invertebrates
- Plants
- See also:
- Great Oxygenation Event
- Respiratory adaptation
- A high school level description of the respiratory system
- Introduction to Respiratory System
- Science aid: Respiratory System A simple guide for high school students
- The Respiratory System University level (Microsoft Word document)
- Lectures in respiratory physiology by noted respiratory physiologist John B. West (also at YouTube)
The human body is the structure of a human being. It is composed of many different types of cells that together create tissues and, subsequently, organ systems. These systems ensure homeostasis and the viability of the human body.
The human body comprises a head, neck, trunk (which includes the thorax and abdomen), arms and hands, legs and feet.
The study of the human body involves anatomy, physiology, histology and embryology. The body varies anatomically in known ways. Physiology focuses on the systems and organs of the human body and their functions.
Many systems and mechanisms interact in order to maintain homeostasis, with safe levels of substances such as sugar and oxygen in the blood.
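As a rough illustration of what such regulation means in practice, the following Python sketch implements a generic negative-feedback loop (the numbers are invented and do not model real physiology): a regulator repeatedly nudges a controlled quantity back toward a set point, which is the basic logic of homeostatic control.

# Minimal negative-feedback sketch; values are arbitrary illustrative units.
def regulate(level: float, set_point: float = 90.0, gain: float = 0.2) -> float:
    """Move `level` a fraction of the way back toward `set_point`."""
    return level + gain * (set_point - level)

level = 140.0  # e.g., a blood-sugar reading spiked above its set point
for _ in range(10):
    level = regulate(level)
print(round(level, 1))  # 95.4 -- the level has been pulled back toward 90.0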
The body is studied by health professionals, physiologists, anatomists, and by artists to assist them in their work.
Click on any of the following blue hyperlinks for more about the Human Body:
- Composition
- Anatomy
- Physiology
- Development
- Society and culture
- Religions
- See also:
- Medicine – The science and practice of the diagnosis, treatment, and prevention of physical and mental illnesses
- Glossary of medicine
- Body image
- Cell physiology
- Comparative physiology
- Comparative anatomy
- Development of the human body
- The Book of Humans (from the late 18th and early 19th centuries)
- Inner Body
- Anatomia 1522–1867: Anatomical Plates from the Thomas Fisher Rare Book Library
Human Anatomy
- YouTube Video: Human Body 101 | National Geographic
- YouTube Video: Visible Body Human Anatomy Atlas Walk-through
- YouTube Video: Human Body Systems for Kids/Human Anatomy for kids/Human Anatomy Systems for kids
Human anatomy – scientific study of the morphology of the adult human. It is subdivided into gross anatomy and microscopic anatomy.
Gross anatomy (also called topographical anatomy, regional anatomy, or anthropotomy) is the study of anatomical structures that can be seen by unaided vision.
Microscopic anatomy is the study of minute anatomical structures assisted with microscopes, and includes histology (the study of the organization of tissues), and cytology (the study of cells).
Click on any of the following blue hyperlinks for more about Human Anatomy:
- Essence of human anatomy
- Branches of human anatomy
- Anatomy of the human body
- History of human anatomy
- Organizations
- Anatomists
- See also:
Anatomy (Greek anatomē, "dissection") is the branch of biology concerned with the study of the structure of organisms and their parts.
Anatomy is a branch of natural science which deals with the structural organization of living things. It is an old science, having its beginnings in prehistoric times.
Anatomy is inherently tied to embryology and evolution, as these are the processes by which anatomy is generated over immediate (embryology) and long (evolutionary) timescales.
Anatomy and physiology, which study (respectively) the structure and function of organisms and their parts, make a natural pair of related disciplines, and they are often studied together. Human anatomy (above) is one of the essential basic sciences that are applied in medicine.
The discipline of anatomy is divided into macroscopic and microscopic anatomy.
Macroscopic anatomy, or gross anatomy, is the examination of an animal's body parts using unaided eyesight. Gross anatomy also includes the branch of superficial anatomy.
Microscopic anatomy involves the use of optical instruments in the study of the tissues of various structures, known as histology, and also in the study of cells.
The history of anatomy is characterized by a progressive understanding of the functions of the organs and structures of the human body. Methods have also improved dramatically, advancing from the examination of animals by dissection of carcasses and cadavers (corpses) to 20th-century medical imaging techniques including X-ray, ultrasound, and magnetic resonance imaging.
Click on any of the following blue hyperlinks for more about Anatomy:
- Definition
- Animal tissues
- Vertebrate anatomy
- Invertebrate anatomy
- Other branches of anatomy
- History
- See also:
- Anatomy at Curlie
- Anatomy, In Our Time. BBC Radio 4. Melvyn Bragg with guests Ruth Richardson, Andrew Cunningham and Harold Ellis.
- Parsons, Frederick Gymer (1911). "Anatomy" . Encyclopædia Britannica. 1 (11th ed.). pp. 920–943.
- Anatomia Collection: anatomical plates 1522 to 1867 (digitized books and images)
The Human Brain along with Human Behavior and Memory
Pictured below: The human brain and the bodily functions it controls
- Video about the Human Brain: by Allan Jones, TED Talk | TED.com (Click on Arrow to start Video)
- YouTube Video: Julia Shaw – Memory hacking: The science of learning in the 21st Century
- YouTube Video: Science of Persuasion*
The human brain is the central organ of the human nervous system, and with the spinal cord makes up the central nervous system. The brain consists of the cerebrum, the brainstem and the cerebellum.
The brain controls most of the activities of the body, processing, integrating, and coordinating the information it receives from the sense organs, and making decisions as to the instructions sent to the rest of the body. The brain is contained in, and protected by, the skull bones of the head.
The cerebrum is the largest part of the human brain. It is divided into two cerebral hemispheres. The cerebral cortex is an outer layer of grey matter, covering the core of white matter. The cortex is split into the neocortex and the much smaller allocortex. The neocortex is made up of six neuronal layers, while the allocortex has three or four. Each hemisphere is conventionally divided into four lobes – the frontal, temporal, parietal, and occipital lobes.
The frontal lobe is associated with executive functions including self-control, planning, reasoning, and abstract thought, while the occipital lobe is dedicated to vision. Within each lobe, cortical areas are associated with specific functions, such as the sensory, motor, and association regions. Although the left and right hemispheres are broadly similar in shape and function, some functions are associated with one side, such as language in the left and visual-spatial ability in the right. The hemispheres are connected by nerve tracts, the largest being the corpus callosum.
The cerebrum is connected by the brainstem to the spinal cord. The brainstem consists of the midbrain, the pons, and the medulla oblongata. The cerebellum is connected to the brain stem by pairs of tracts.
Within the cerebrum is the ventricular system, consisting of four interconnected ventricles in which cerebrospinal fluid is produced and circulated.
Underneath the cerebral cortex are several important structures, including
- the thalamus,
- the epithalamus,
- the pineal gland,
- the hypothalamus,
- the pituitary gland,
- the subthalamus;
- the limbic structures, including
- the amygdala and the hippocampus;
- the claustrum,
- the various nuclei of the basal ganglia;
- and the basal forebrain structures, and the three circumventricular organs.
The cells of the brain include neurons and supportive glial cells. There are an estimated 86 billion neurons in the brain, and a roughly equal number of other cells. Brain activity is made possible by the interconnections of neurons and their release of neurotransmitters in response to nerve impulses.
Neurons form elaborate neural networks of neural pathways and circuits. The whole circuitry is driven by the process of neurotransmission.
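A deliberately simplified model can make the idea of neurotransmission-driven circuitry concrete. The Python sketch below is a toy abstraction (not a claim about real neural dynamics): each model neuron sums weighted inputs from upstream cells and "fires" only when the total crosses a threshold, with negative weights standing in for inhibitory synapses.

# Toy neuron: weighted sum of inputs compared against a firing threshold.
def fires(inputs: list[float], weights: list[float], threshold: float = 1.0) -> bool:
    """Return True if the weighted input sum reaches the firing threshold."""
    total = sum(signal * weight for signal, weight in zip(inputs, weights))
    return total >= threshold

# Two excitatory inputs (positive weights) and one inhibitory input (negative):
print(fires([1.0, 1.0, 1.0], [0.8, 0.6, -0.3]))  # True: net drive 1.1 >= 1.0
print(fires([1.0, 1.0, 1.0], [0.8, 0.6, -0.6]))  # False: net drive 0.8 < 1.0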
The brain is protected by the skull, suspended in cerebrospinal fluid, and isolated from the bloodstream by the blood–brain barrier. However, the brain is still susceptible to damage, disease, and infection. Damage can be caused by trauma, or a loss of blood supply known as a stroke.
The brain is also susceptible to degenerative disorders, such as Parkinson's disease, dementias including Alzheimer's disease, and multiple sclerosis. Psychiatric conditions, including schizophrenia and clinical depression, are thought to be associated with brain dysfunctions.
The brain can also be the site of tumours, both benign and malignant; the malignant ones mostly originate from other sites in the body. The study of the anatomy of the brain is neuroanatomy, while the study of its function is neuroscience.
A number of techniques are used to study the brain. Specimens from other animals, which may be examined microscopically, have traditionally provided much information. Medical imaging technologies such as functional neuroimaging, and electroencephalography (EEG) recordings are important in studying the brain.
The medical history of people with brain injury has provided insight into the function of each part of the brain.
In culture, the philosophy of mind has for centuries attempted to address the question of the nature of consciousness and the mind-body problem.
The pseudoscience of phrenology attempted to localize personality attributes to regions of the cortex in the 19th century. In science fiction, brain transplants are imagined in tales such as the 1942 novel Donovan's Brain.
Click on any of the following blue hyperlinks for more about the Human Brain:
- Structure:
  - Gross anatomy
  - Microanatomy
  - Cerebrospinal fluid
  - Blood supply
- Development
- Function:
  - Motor control
  - Sensory
  - Regulation
  - Language
  - Lateralisation
  - Emotion
  - Cognition
- Physiology:
  - Neurotransmission
  - Metabolism
- Research:
  - Methods
  - Imaging
- Clinical significance
- Society and culture:
  - The mind
  - Brain size
  - In popular culture
- History
- Comparative anatomy
- See also:
Human behavior refers to the array of every physical action and observable emotion associated with individuals, as well as the human race. While specific traits of one's personality and temperament may be more consistent, other behaviors will change as one moves from birth through adulthood.
In addition to being dictated by age and genetics, behavior, driven in part by thoughts and feelings, offers insight into the individual psyche, revealing among other things attitudes and values. The study of social behavior, a subset of human behavior, examines the considerable influence of social interaction and culture. Additional influences include ethics, social surroundings, authority, rapport, hypnosis, persuasion, and coercion.
The behavior of humans (and other organisms or even mechanisms) falls within a range with some behavior being common, some unusual, some acceptable, and some beyond acceptable limits.
In sociology, behavior in general includes all basic human actions, including those that carry no particular meaning and are not directed at other people. Behavior in this general sense should not be mistaken for social behavior, which is a more advanced social action specifically directed at other people.
The acceptability of behavior depends heavily upon social norms and is regulated by various means of social control. Human behavior is studied by the specialized academic disciplines including:
Human behavior is experienced throughout an individual's entire lifetime. It includes the ways individuals act based on different factors such as genetics, social norms, core faith, and attitude.
Behavior is impacted by certain traits each individual has. The traits vary from person to person and can produce different actions or behavior from each person. Social norms also impact behavior.
Because human society is inherently conformist, people are pressured into following certain rules and displaying certain behaviors, which conditions the way they behave.
Different behaviors are deemed to be either acceptable or unacceptable in different societies and cultures. Core faith can be perceived through the religion and philosophy of that individual. It shapes the way a person thinks and this in turn results in different human behaviors.
Attitude can be defined as "the degree to which the person has a favorable or unfavorable evaluation of the behavior in question." One's attitude is essentially a reflection of the behavior he or she will portray in specific situations. Thus, human behavior is greatly influenced by the attitudes we use on a daily basis.
Click on any of the following blue hyperlinks for more about Human Behavior:
- Factors
- See also:
Human Memory:
Memory is the faculty of the brain by which data or information is encoded, stored, and retrieved when needed. It is the retention of information over time for the purpose of influencing future action. If past events could not be remembered, it would be impossible for language, relationships, or personal identity to develop. Memory loss is usually described as forgetfulness or amnesia.
Memory is often understood as an informational processing system with explicit and implicit functioning that is made up of a sensory processor, short-term (or working) memory, and long-term memory. This model can be related to underlying neurons. The sensory processor allows information from the outside world to be sensed in the form of chemical and physical stimuli and attended to with various levels of focus and intent.
Working memory serves as an encoding and retrieval processor. Information in the form of stimuli is encoded in accordance with explicit or implicit functions by the working memory processor. The working memory also retrieves information from previously stored material.
Finally, the function of long-term memory is to store data through various categorical models or systems.
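The three-stage model just described can be sketched in code. The snippet below is an illustration of the model only (the capacities and names are invented for the example, loosely echoing the classic "about seven items" estimate for working memory): stimuli enter a small sensory buffer, attended items are encoded into working memory, and rehearsed items are consolidated into a long-term store.

# Toy sketch of the sensory -> working -> long-term pipeline described above.
from collections import deque

sensory_buffer = deque(maxlen=5)   # fleeting, small-capacity sensory store
working_memory = deque(maxlen=7)   # limited-capacity encoding/retrieval store
long_term_store = set()            # durable categorical storage

def perceive(stimulus, attended):
    sensory_buffer.append(stimulus)
    if attended:                   # unattended stimuli simply decay
        working_memory.append(stimulus)

def rehearse(item):
    if item in working_memory:     # consolidation requires the item be held
        long_term_store.add(item)

perceive("phone number", attended=True)
perceive("background hum", attended=False)
rehearse("phone number")
print("phone number" in long_term_store)    # True
print("background hum" in long_term_store)  # False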
Declarative, or explicit, memory is the conscious storage and recollection of data. Under declarative memory resides semantic and episodic memory. Semantic memory refers to memory that is encoded with specific meaning, while episodic memory refers to information that is encoded along a spatial and temporal plane.
Declarative memory is usually the primary process thought of when referencing memory. Non-declarative, or implicit, memory is the unconscious storage and recollection of information.
An example of a non-declarative process would be the unconscious learning or retrieval of information by way of procedural memory, or a priming phenomenon. Priming is the process of subliminally arousing specific responses from memory and shows that not all memory is consciously activated, whereas procedural memory is the slow and gradual learning of skills that often occurs without conscious attention to learning.
Memory is not a perfect processor, and it is affected by many factors. The ways by which information is encoded, stored, and retrieved can all be corrupted. The amount of attention given to new stimuli can diminish the amount of information that becomes encoded for storage.
Also, the storage process can become corrupted by physical damage to areas of the brain that are associated with memory storage, such as the hippocampus. Finally, the retrieval of information from long-term memory can be disrupted because of decay within long-term memory. Normal functioning, decay over time, and brain damage all affect the accuracy and capacity of the memory.
Click on any of the following blue hyperlinks for more about Human Memory:
- Sensory memory
- Short-term memory
- Long-term memory
- Types
- Study techniques
- Failures
- Physiology
- Cognitive neuroscience
- Genetics
- In infancy
- Aging
- Disorders
- Influencing factors
- Stress
- Sleep
- Construction for general manipulation
- Improving
- In plants
- See also:
- Prenatal memory
- Adaptive memory
- Animal memory
- Collective memory
- False memory
- Intermediate-term memory
- Involuntary memory
- Method of loci
- Mnemonic major system
- Photographic memory
- Politics of memory
- Zalta, Edward N. (ed.). "Memory". Stanford Encyclopedia of Philosophy.
- Memory at PhilPapers
- Memory at the Indiana Philosophy Ontology Project
- Memory on In Our Time at the BBC
- Memory-related resources from the National Institutes of Health
- On the Seven Sins of Memory with Professor Daniel Schacter
- 'Bridging the Gaps: A Portal for Curious Minds'
Development of the Human Body, including Implantable Technology (IEEE 1/1/2019 Article)
Pictured below: Many recent advances in implantable devices would, not so long ago, have been strictly in the domain of science fiction. At the same time, the public remains mystified, if not conflicted, about implantable technologies. Rising awareness of social issues related to implantable devices requires further exploration.
[Your Webhost: we've included two sub-topics under "Development of the Human Body". First, an article by Robert Sobot published in the January 1, 2019 issue of the IEEE magazine "Technology and Society" (Editorial & Opinion, Human Impacts, Magazine Articles):
"Robotics, Social Implications of Technology, Societal Impact"
This article covers the many implantable body parts enabling a greater quality of life for those who are handicapped (in one form or another, per the above picture). For example, I have a heart pacemaker.
The second sub-topic follows below]:
___________________________________________________________________________
Development of the human body is the process of growth to maturity. The process begins with fertilization, where an egg released from the ovary of a female is penetrated by a sperm cell from a male. The resulting zygote develops through mitosis and cell differentiation, and the resulting embryo then implants in the uterus, where the embryo continues development through a fetal stage until birth.
Further growth and development continues after birth, and includes both physical and psychological development, influenced by genetic, hormonal, environmental and other factors. This continues throughout life: through childhood and adolescence into adulthood.
Click on any of the following blue hyperlinks for more about Development of the Human Body:
"Robotics, Social Implications of Technology, Societal Impact"
this article covers the many implantable body parts enabling greater quality of life by those who are handicapped (in one form or another per above picture). For example, I have a heart pacemaker.
The second sub-topic follows below]:
___________________________________________________________________________
Development of the human body is the process of growth to maturity. The process begins with fertilization, where an egg released from the ovary of a female is penetrated by a sperm cell from a male. The resulting zygote develops through mitosis and cell differentiation, and the resulting embryo then implants in the uterus, where the embryo continues development through a fetal stage until birth.
Further growth and development continues after birth, and includes both physical and psychological development, influenced by genetic, hormonal, environmental and other factors. This continues throughout life: through childhood and adolescence into adulthood.
Click on any of the following blue hyperlinks for more about Development of the Human Body:
Coaching vs. Mentoring
- YouTube Video: What is the Difference between Mentoring and Coaching?
- YouTube Video: Top 10 Coaching Mistakes
- YouTube Video: The power of mentoring: Lori Hunt at TEDxCCS
Coaching is a form of development in which an experienced person, called a coach, supports a learner or client in achieving a specific personal or professional goal by providing training and guidance.
The learner is sometimes called a student. Occasionally, coaching may mean an informal relationship between two people, of whom one has more experience and expertise than the other and offers advice and guidance as the latter learns; but coaching differs from mentoring by focusing on specific tasks or objectives, as opposed to more general goals or overall development.
Click on any of the following blue hyperlinks for more about Coaching:
___________________________________________________________________________
Mentorship is a relationship in which a more experienced or more knowledgeable person helps to guide a less experienced or less knowledgeable person. The mentor may be older or younger than the person being mentored, but he or she must have a certain area of expertise.
It is a learning and development partnership between someone with vast experience and someone who wants to learn. Interaction with an expert may also be necessary to gain proficiency with/in cultural tools. Mentorship experience and relationship structure affect the "amount of psychosocial support, career guidance, role modeling, and communication that occurs in the mentoring relationships in which the protégés and mentors engaged."
The person in receipt of mentorship may be referred to as a protégé (male), a protégée (female), an apprentice or, in the 2000s, a mentee. The mentor may be referred to as a godfather or godmother.
"Mentoring" is a process that always involves communication and is relationship-based, but its precise definition is elusive, with more than 50 definitions currently in use. One definition of the many that have been proposed, is the following:
"Mentoring is a process for the informal transmission of knowledge, social capital, and the psychosocial support perceived by the recipient as relevant to work, career, or professional development; mentoring entails informal communication, usually face-to-face and during a sustained period of time, between a person who is perceived to have greater relevant knowledge, wisdom, or experience (the mentor) and a person who is perceived to have less (the protégé)".
Mentoring in Europe has existed since at least Ancient Greek times, and the roots of the word go back to Mentor, son of Alcimus, in Homer's Odyssey. Since the 1970s it has spread in the United States, mainly in training contexts, with important historical links to the movement advancing workplace equity for women and minorities, and it has been described as "an innovation in American management".
Click on any of the following blue hyperlinks for more about Mentoring:
- Historical
- Professional bodies and qualifications
- Techniques
- Types
- Benefits
- Contemporary research and practice in the US
- Corporate programs
- Matching approaches
- In education
- Blended mentoring
- Reverse mentoring
- Business mentoring
- See also:
I have chained together the following three topics as reasoning methods that people use interactively to form opinions and then make decisions.
Common Sense vs. Intuition vs. Reasoning
YouTube Video: 3 ways to make better decisions -- by thinking like a computer | Tom Griffiths TED
Pictured: A diagram representing the constitution of the United States as proposed by Thomas Paine in his publication "Common Sense" (1776)
Common sense is a basic ability to perceive, understand, and judge things that is shared by ("common to") nearly all people and can reasonably be expected of nearly all people without need for debate.
The everyday understanding of common sense derives from philosophical discussion involving several European languages. Related terms in other languages include Latin sensus communis, Greek κοινὴ αἴσθησις (koinē aísthēsis), and French bon sens, but these are not straightforward translations in all contexts. Similarly in English, there are different shades of meaning, implying more or less education and wisdom: "good sense" is sometimes seen as equivalent to "common sense", and sometimes not.
"Common sense" has at least two specifically philosophical meanings. One is a capability of the animal soul (Greek psukhē) proposed by Aristotle, which enables different individual senses to collectively perceive the characteristics of physical things such as movement and size, which all physical things have in different combinations, allowing people and other animals to distinguish and identify physical things.
This common sense is distinct from basic sensory perception and from human rational thinking, but cooperates with both. The second special use of the term is Roman-influenced and is used for the natural human sensitivity for other humans and the community.
Just like the everyday meaning, both of these refer to a type of basic awareness and ability to judge that most people are expected to share naturally, even if they cannot explain why.
All these meanings of "common sense", including the everyday one, are inter-connected in a complex history and have evolved during important political and philosophical debates in modern western civilization, notably concerning science, politics and economics.
The interplay between the meanings has come to be particularly notable in English, as opposed to other western European languages, and the English term has become international.
In modern times the term "common sense" has frequently been used for rhetorical effect, sometimes pejorative, and sometimes appealed to positively, as an authority. It can be negatively equated to vulgar prejudice and superstition, or on the contrary it is often positively contrasted to them as a standard for good taste and as the source of the most basic axioms needed for science and logic.
It was at the beginning of the eighteenth century that this old philosophical term first acquired its modern English meaning: "Those plain, self-evident truths or conventional wisdom that one needed no sophistication to grasp and no proof to accept precisely because they accorded so well with the basic (common sense) intellectual capacities and experiences of the whole social body".
This began with Descartes' criticism of it, and what came to be known as the dispute between "rationalism" and "empiricism". In the opening line of one of his most famous books, Discourse on Method, Descartes established the most common modern meaning, and its controversies, when he stated that everyone has a similar and sufficient amount of common sense (bon sens), but it is rarely used well.
Therefore, a skeptical logical method described by Descartes needs to be followed and common sense should not be overly relied upon. In the ensuing 18th century Enlightenment, common sense came to be seen more positively as the basis for modern thinking. It was contrasted to metaphysics, which was, like Cartesianism, associated with the ancien régime.
Thomas Paine's polemical pamphlet Common Sense (1776) has been described as the most influential political pamphlet of the 18th century, affecting both the American and French revolutions. Today, the concept of common sense, and how it should best be used, remains linked to many of the most perennial topics in epistemology and ethics, with special focus often directed at the philosophy of the modern social sciences.
Click on any of the following blue hyperlinks for more about Common Sense:
- Aristotelian
- Roman
- Cartesian
- The Enlightenment after Descartes
- Kant: In aesthetic taste
- Contemporary philosophy
- Catholic theology
- Projects
- See also:
- Counterintuitive
- Appeal to tradition
- A priori knowledge
- Basic belief
- Common knowledge
- Common sense response to the Diallelus problem
- Commonsense reasoning (in Artificial intelligence)
- Doxa
- Endoxa
- Folk wisdom
- Knowledge
- Norm (sociology)
- Ordinary language philosophy
- Phronesis, practical wisdom
- Pre-theoretic belief
- Public opinion
- Rational choice theory
Intuition is the ability to acquire knowledge without proof, evidence, or conscious reasoning, or without understanding how the knowledge was acquired.
Different writers give the word "intuition" a great variety of different meanings, ranging from direct access to unconscious knowledge, unconscious cognition, inner sensing, inner insight to unconscious pattern-recognition and the ability to understand something instinctively, without the need for conscious reasoning.
Some philosophers contend that the word "intuition" is often misunderstood or misused to mean instinct, truth, belief, or meaning, whereas others contend that faculties such as instinct, belief, and intuition are factually related.
The word intuition comes from the Latin verb intueri, translated as "consider", or from the late Middle English word intuit, "to contemplate". Intuition Peak on Livingston Island in the South Shetland Islands, Antarctica, is named in appreciation of the role of scientific intuition in the advancement of human knowledge.
Click on any of the following blue hyperlinks for more about Intuition:
- Philosophy
- Psychology
- Eastern philosophy
- Hinduism
- Buddhism
- Islam
- Western philosophy
- Freud
- Jung
- Modern psychology
- Colloquial usage
- See also:
- Artistic inspiration
- Brainstorming
- Common sense
- Cognition
- Cryptesthesia
- Déjà vu
- Extra-sensory perception
- Focusing
- Inner Relationship Focusing
- Grok
- Insight
- Instinct
- Intuition and decision-making
- Intuition pump
- Intuitionism
- Trained intuition (intelligence analysis)
- List of thought processes
- Medical intuitive
- Morphic resonance
- Nous
- Phenomenology (philosophy)
- Precognition
- Preconscious
- Rapport
- Religious experience
- Remote viewing
- Serendipity
- Social intuitionism
- Subconscious
- Synchronicity
- Tacit knowledge
- Truthiness
- Unconscious mind
Reasoning:
Reason is the capacity for consciously making sense of things, applying logic, establishing and verifying facts, and changing or justifying practices, institutions, and beliefs based on new or existing information.
It is closely associated with such characteristically human activities as philosophy, science, language, mathematics, and art and is normally considered to be a definitive characteristic of human nature. Reason, or an aspect of it, is sometimes referred to as rationality.
Reasoning is associated with thinking, cognition, and intellect. Reasoning may be subdivided into forms of logical reasoning (forms associated with the strict sense):
- deductive reasoning,
- inductive reasoning,
- abductive reasoning;
- and other modes of reasoning considered more informal, such as:
Along these lines, a distinction is often drawn between discursive reason, reason proper, and intuitive reason, in which the reasoning process—however valid—tends toward the personal and the opaque.
Although in many social and political settings logical and intuitive modes of reason may clash, in other contexts intuition and formal reason are seen as complementary, rather than adversarial as, for example, in mathematics, where intuition is often a necessary building block in the creative process of achieving the hardest form of reason, a formal proof.
Reason, like habit or intuition, is one of the ways by which thinking comes from one idea to a related idea. For example, it is the means by which rational beings understand themselves to think about cause and effect, truth and falsehood, and what is good or bad. It is also closely identified with the ability to self-consciously change beliefs, attitudes, traditions, and institutions, and therefore with the capacity for freedom and self-determination.
In contrast to reason as an abstract noun, a reason is a consideration which explains or justifies some event, phenomenon, or behavior. The field of logic studies ways in which human beings reason formally through argument.
Psychologists and cognitive scientists have attempted to study and explain how people reason, e.g. which cognitive and neural processes are engaged, and how cultural factors affect the inferences that people draw. The field of automated reasoning studies how reasoning may or may not be modeled computationally. Animal psychology considers the question of whether animals other than humans can reason.
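Since the passage above notes that automated reasoning studies how reasoning may be modeled computationally, a minimal sketch may help. The Python snippet below implements forward chaining over if-then rules, one simple way to mechanize deductive reasoning (the rules and facts here are invented purely for illustration):

# Minimal forward-chaining deduction: a rule fires when all its premises
# are already known, adding its conclusion to the set of derived facts.
RULES = [
    ({"is human"}, "is mortal"),
    ({"is mortal", "is remembered"}, "lives on in memory"),
]

def forward_chain(facts):
    """Apply rules repeatedly until no new conclusions can be drawn."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(sorted(forward_chain({"is human", "is remembered"})))
# ['is human', 'is mortal', 'is remembered', 'lives on in memory']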
Click on any of the following blue hyperlinks for more about Reasoning:
- Etymology and related words
- Philosophical history
- Reason compared to related concepts
- Traditional problems raised concerning reason
- Reason in particular fields of study
- See also:
- Confirmation bias
- Conformity
- Logic and rationality
- Outline of thought - topic tree that identifies many types of thoughts/thinking, types of reasoning, aspects of thought, related fields, and more.
- Outline of human intelligence - topic tree presenting the traits, capacities, models, and research fields of human intelligence, and more.
Idealism vs. Pragmatism including Practical Idealism
YouTube Video: Pragmatic Idealism Andy Posner at TEDxProvidence
Pictured Below: Top as Example of Idealism; Bottom as Example of a Pragmatist
In philosophy, idealism is the group of philosophies which assert that reality, or reality as we can know it, is fundamentally mental, mentally constructed, or otherwise immaterial. Epistemologically, idealism manifests as a skepticism about the possibility of knowing any mind-independent thing.
In a sociological sense, idealism emphasizes how human ideas—especially beliefs and values—shape society. As an ontological doctrine, idealism goes further, asserting that all entities are composed of mind or spirit. Idealism thus rejects physicalist and dualist theories that fail to ascribe priority to the mind.
The earliest extant arguments that the world of experience is grounded in the mental derive from India and Greece. The Hindu idealists in India and the Greek Neoplatonists gave panentheistic arguments for an all-pervading consciousness as the ground or true nature of reality.
In contrast, the Yogācāra school, which arose within Mahayana Buddhism in India in the 4th century CE, based its "mind-only" idealism to a greater extent on phenomenological analyses of personal experience. This turn toward the subjective anticipated empiricists such as George Berkeley, who revived idealism in 18th-century Europe by employing skeptical arguments against materialism.
Beginning with Immanuel Kant, German idealists such as G. W. F. Hegel, Johann Gottlieb Fichte, Friedrich Wilhelm Joseph Schelling, and Arthur Schopenhauer dominated 19th-century philosophy. This tradition, which emphasized the mental or "ideal" character of all phenomena, gave birth to idealistic and subjectivist schools ranging from British idealism to phenomenalism to existentialism.
The historical influence of this branch of idealism remains central even to the schools that rejected its metaphysical assumptions, such as Marxism, pragmatism and positivism.
Click on any of the following blue hyperlinks for more about Idealism:
- Definitions
- Classical idealism
- Idealism in Indian and Buddhist thought
- Subjective idealism
- Transcendental idealism
- Objective idealism
- See also:
- Cogito ergo sum
- Mind over matter
- Neo-Vedanta
- New Thought
- Solipsism
- Spirituality
- Idealism at PhilPapers
- Idealism at the Indiana Philosophy Ontology Project
- "Idealism". Stanford Encyclopedia of Philosophy.
- "German idealism". Internet Encyclopedia of Philosophy.
- 'The Triumph of Idealism', lecture by Professor Keith Ward offering a positive view of Idealism, at Gresham College, 13 March 2008 (available in text, audio, and video download)
Pragmatism is a philosophical tradition that began in the United States around 1870. Its origins are often attributed to the philosophers William James, John Dewey, and Charles Sanders Peirce. Peirce later described it in his pragmatic maxim: "Consider the practical effects of the objects of your conception. Then, your conception of those effects is the whole of your conception of the object."
Pragmatism considers thought an instrument or tool for prediction, problem solving and action, and rejects the idea that the function of thought is to describe, represent, or mirror reality. Pragmatists contend that most philosophical topics—such as the nature of knowledge, language, concepts, meaning, belief, and science—are all best viewed in terms of their practical uses and successes.
The philosophy of pragmatism "emphasizes the practical application of ideas by acting on them to actually test them in human experiences." Pragmatism focuses on a "changing universe rather than an unchanging one as the Idealists, Realists and Thomists had claimed."
Click on any of the following blue hyperlinks for more about Pragmatism:
- Origins
- Core tenets
  - Anti-reification of concepts and theories
  - Naturalism and anti-Cartesianism
  - Reconciliation of anti-skepticism and fallibilism
  - Pragmatist theory of truth and epistemology
- In other fields of philosophy
  - Philosophy of science
  - Logic
  - Metaphysics
  - Philosophy of mind
  - Ethics
  - Aesthetics
  - Philosophy of religion
- Analytical, neoclassical, and neopragmatism
- Legacy and contemporary relevance
  - Effects on social sciences
  - Effects on public administration
  - Effects on feminism
  - Effects on urbanism
- Criticisms
- List of pragmatists
- See also:
- American philosophy
- Charles Sanders Peirce bibliography
- Pragmatic theory of truth
- Pragmatism as an eighth tradition of Communication theory
- Scientific method#Pragmatic model
- New legal realism
- General sources
- Journals and organizations: There are several peer-reviewed journals dedicated to pragmatism, for example:
- Contemporary Pragmatism, affiliated with the International Pragmatism Society
- European Journal of Pragmatism and American Philosophy, affiliated with the Associazione Culturale Pragma (Italy)
- Nordic Studies in Pragmatism, journal of the Nordic Pragmatism Network
- Pragmatism Today, journal of the Central European Pragmatist Forum (CEPF)
- Transactions of the Charles S. Peirce Society, journal of the Charles S. Peirce Society
- William James Studies, journal of the William James Society
- Other online resources and organizations
- Pragmatist Sociology
- Pragmatism Cybrary
- Arisbe: The Peirce Gateway
- Centro de Estudos sobre Pragmatismo (CEP) — Center for Pragmatism Studies (CPS) (Brazil)
- Charles S. Peirce Studies
- Dutch Pragmatism Foundation
- Helsinki Peirce Research Center (Finland)
- Institute for American Thought
- John Dewey Society
- Neopragmatism.org
- Peirce Edition Project
- Society for the Advancement of American Philosophy
Practical idealism is a term first used by John Dewey in 1917 and subsequently adopted by Mahatma Gandhi (Gandhi Marg 2002). It describes a philosophy that holds it to be an ethical imperative to implement ideals of virtue or good. It further holds it to be equally immoral to either refuse to make the compromises necessary to realise high ideals, or to discard ideals in the name of expediency.
Practical idealism in its broadest sense may be compared to utilitarianism in its emphasis on outcomes, and to political economy and enlightened self-interest in its emphasis on the alignment of what is right with what is possible.
International Affairs:
In foreign policy and international relations, the phrase "practical idealism" has come to be taken as a theory or set of principles that diplomats or politicians use to describe or publicize their outlook on foreign policy. It purports to be a pragmatic compromise between realism, which stresses the promotion of a state's "narrow" and amoral self-interest, and idealism, which aims to use the state's influence and power to promote higher liberal ideals such as peace, justice, and co-operation between nations.
In this view, realism is seen as a prescription for Machiavellian selfishness and ruthlessness in international relations. Machiavelli recommended political strategies for reigning, or aspiring, princes; his infamous teachings revolve around the overarching and ultimate goal of any prince: remaining in power.
These strategies range from those that, today, might be called moderate or liberal political advice to those that, today, might be called illegal, immoral or, in the U.S., unconstitutional. For better or worse, Machiavelli's name, like that of novelist George Orwell, is now associated with manipulative acts and philosophies that disregard civil rights and basic human dignity in favor of deception, intimidation, and coercion.
This extreme form of realism is sometimes considered both unbecoming of nations' aspirations and, ultimately, morally and spiritually unsatisfying for their individual people. Extreme idealism, on the other hand, is associated with moralist naiveté and the failure to prioritize the interests of one's state above other goals.
More recently, practical idealism has been advocated by United States Secretary of State Condoleezza Rice and by Philip D. Zelikow, then counselor of the State Department. Zelikow has defended the foreign policy of the George W. Bush administration as being "motivated in good part by ideals that transcend narrow conceptions of material self-interest." He also assesses former U.S. presidents Theodore Roosevelt and Franklin Roosevelt as practitioners of practical idealism.
SECRETARY RICE: "Well, American foreign policy has always had, and I think rightfully had, a streak of idealism, which means that we care about values, we care about principle. It's not just getting to whatever solution is available, but it's doing that within the context of principles and values.
And at a time like this, when the world is changing very rapidly and when we have the kind of existential challenge that we have with terrorism and extremism, it's especially important to lead from values. And I don't think we've had a president in recent memory who has been so able to keep his policies centered in values.
The responsibility, then, of all of us is to take policies that are rooted in those values and make them work on a day-to-day basis so that you're always moving forward toward a goal, because nobody believes that the kinds of monumental changes that are going on in the world or that we are indeed seeking are going to happen in a week's time frame or a month's time frame or maybe even a year's time frame. So it's the connection, the day-to-day operational policy connection between those ideals and policy outcomes." - Condoleezza Rice, Washington Post interview
Singaporean diplomat and former ambassador to the United Nations Dr. Tommy Koh quoted UN Secretary-General U Thant when he described himself as a practical idealist:
"If I am neither a Realist nor a Moralist, what am I? If I have to stick a label on myself, I would quote U Thant and call myself a practical Idealist.
I believe that as a Singaporean diplomat, my primary purpose is to protect the independence, sovereignty, territorial integrity and economic well-being of the state of Singapore. I believe that I ought to pursue these objectives by means which are lawful and moral. On the rare occasions when the pursuit of my country's vital national interest compels me to do things which are legally or morally dubious, I ought to have a bad conscience and be aware of the damage which I have done to the principle I have violated and to the reputation of my country. I believe that I must always consider the interests of other states and have a decent regard for the opinion of others.
I believe that it is in Singapore's long-term interest to strengthen international law and morality, the international system for curbing the use of force and the institutions for the pacific settlement of disputes.
Finally, I believe that it is in the interests of all nations to strengthen international co-operation and to make the world's political and economic order more stable, effective and equitable." — "Can Any Country Afford a Moral Foreign Policy?"
Critics have questioned whether practical idealism is merely a slogan with no substantive policy implications (Gude 2005).
U.S. presidential politics:
The phrase practical idealism was also used as a slogan by John Kusumi, who ran as an independent candidate in the 1984 presidential election. This was the first introduction of the phrase in U.S. presidential politics. (United Press International 1984) (New Haven Journal Courier 1984) (New Haven Register 1984)
Former Democratic Vice President Al Gore also used the phrase in the 1990s, as did Republican Secretary of State Condoleezza Rice in the 2000s.
American political scientist Jack Godwin elaborates on the doctrine of practical idealism in The Arrow and the Olive Branch: Practical Idealism in US Foreign Policy.
See also:
- Gandhi's Practical Idealism, analysis by the Gandhi Peace Foundation
- "If I Were Graduation Speaker" opinion article in Christian Science Monitor by Josiah H. Brown, 24 May 1996; asks, “To what kind of work should a practical idealist aspire?”
- American Practical Idealism speech by Al Gore, 1998, Vice-President of the United States
- Canadian Practical Idealism writings by Akaash Maharaj, 1998-2003 National Policy Chair of the Liberal Party of Canada
- At State, Rice Takes Control of Diplomacy, Washington Post, 31 July 2005, and the interview transcript
The Mind and Logical Reasoning
YouTube Video: Logical argument and deductive reasoning exercise example*
*-by Khan Academy
The mind is a set of cognitive faculties including consciousness, perception, thinking, judgement, and memory.
The mind is usually defined as the faculty of an entity's thoughts and consciousness. It holds the power of imagination, recognition, and appreciation, and is responsible for processing feelings and emotions, resulting in attitudes and actions.
There is a lengthy tradition in philosophy, religion, psychology, and cognitive science about what constitutes a mind and what its distinguishing properties are.
One open question regarding the nature of the mind is the mind–body problem, which investigates the relation of the mind to the physical brain and nervous system. Pre-scientific viewpoints included dualism and idealism, which considered the mind somehow non-physical.
Modern views center around physicalism and functionalism, which hold that the mind is roughly identical with the brain or reducible to physical phenomena such as neuronal activity.
Another question concerns which types of beings are capable of having minds: whether mind is exclusive to humans, possessed also by some or all animals or even by all living things, whether it is a strictly definable characteristic at all, and whether mind can also be a property of some types of man-made machines.
Whatever its nature, it is generally agreed that mind is that which enables a being to have subjective awareness and intentionality towards their environment, to perceive and respond to stimuli with some kind of agency, and to have consciousness, including thinking and feeling.
The concept of mind is understood in many different ways by many different cultural and religious traditions. Some see mind as a property exclusive to humans whereas others ascribe properties of mind to non-living entities (e.g. panpsychism and animism), to animals and to deities.
Some of the earliest recorded speculations linked mind (sometimes described as identical with soul or spirit) to theories concerning both life after death, and cosmological and natural order, for example in the doctrines of Zoroaster, the Buddha, Plato, Aristotle, and other ancient Greek, Indian and, later, Islamic and medieval European philosophers.
Important philosophers of mind include:
- Plato,
- Descartes,
- Leibniz,
- Locke,
- Berkeley,
- Hume,
- Kant,
- Hegel,
- Schopenhauer,
- Searle,
- Dennett,
- Fodor,
- Nagel,
- and Chalmers.
Psychologists such as Freud and James, and computer scientists such as Turing and Putnam, developed influential theories about the nature of the mind. The possibility of non-human minds is explored in the field of artificial intelligence, which works closely with cybernetics and information theory to understand the ways in which information processing by non-biological machines is comparable to, or differs from, mental phenomena in the human mind.
Click on any of the following blue hyperlinks for more about the Human Mind:
- Definitions
- Mental faculties
- Mental content
- Relation to the brain
- Evolutionary history of the human mind
- Philosophy of mind
- Scientific study
- Mental health
- Non-human minds
- In religion
- In pseudoscience
- See also:
- Outline of human intelligence – topic tree presenting the traits, capacities, models, and research fields of human intelligence, and more.
- Outline of thought – topic tree that identifies many types of thoughts, types of thinking, aspects of thought, related fields, and more.
- Cognitive sciences
- Conscience
- Consciousness
- Explanatory gap
- Hard problem of consciousness
- Ideasthesia
- Mental energy
- Mind–body problem
- Mind at Large
- Neural Darwinism
- Noogenesis
- Philosophical zombie
- Philosophy of mind
- Problem of other minds
- Sentience
- Skandha
- Subjective character of experience
- Theory of mind
There are two kinds of logical reasoning besides formal deduction: induction and abduction. Given a precondition or premise, a conclusion or logical consequence, and a rule or material conditional that implies the conclusion given the precondition, the three modes can be distinguished as follows (a short code sketch contrasting them appears after this list):
- Deductive reasoning determines whether the truth of a conclusion can be determined for that rule, based solely on the truth of the premises. Example: "When it rains, things outside get wet. The grass is outside, therefore: when it rains, the grass gets wet." Mathematical logic and philosophical logic are commonly associated with this type of reasoning.
- Inductive reasoning attempts to support a determination of the rule. It hypothesizes a rule after numerous examples are taken to be a conclusion that follows from a precondition in terms of such a rule. Example: "The grass got wet numerous times when it rained, therefore: the grass always gets wet when it rains." While they may be persuasive, these arguments are not deductively valid, see the problem of induction. Science is associated with this type of reasoning.
- Abductive reasoning, a.k.a. inference to the best explanation, selects a cogent set of preconditions. Given a true conclusion and a rule, it attempts to select some possible premises that, if true also, can support the conclusion, though not uniquely. Example: "When it rains, the grass gets wet. The grass is wet. Therefore, it might have rained." This kind of reasoning can be used to develop a hypothesis, which in turn can be tested by additional reasoning or data. Diagnosticians, detectives, and scientists often use this type of reasoning.
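The contrast can be made concrete in a few lines of code. The following Python sketch is offered purely as an illustration (the function names and the rule representation are invented here, not drawn from any source above); it models a rule as a (precondition, conclusion) pair and applies each mode of reasoning to the rain-and-grass example:

# Rule: "when it rains, the grass gets wet," modeled as (precondition, conclusion).
rule = ("rain", "wet grass")

def deduce(rule, fact):
    """Deduction: given the rule and its precondition, conclude the consequence."""
    precondition, conclusion = rule
    return conclusion if fact == precondition else None

def induce(observations):
    """Induction: hypothesize a rule after every observed precondition was
    followed by the same outcome; the result is supported, not deductively valid."""
    preconditions = {pre for pre, _ in observations}
    outcomes = {out for _, out in observations}
    if len(preconditions) == 1 and len(outcomes) == 1:
        return (preconditions.pop(), outcomes.pop())  # the hypothesized rule
    return None

def abduce(rule, observed_outcome):
    """Abduction: given the rule and an observed conclusion, propose the
    precondition as one possible, but not unique, explanation."""
    precondition, conclusion = rule
    return precondition if observed_outcome == conclusion else None

print(deduce(rule, "rain"))                 # -> wet grass
print(induce([("rain", "wet grass")] * 3))  # -> ('rain', 'wet grass')
print(abduce(rule, "wet grass"))            # -> rain (it *might* have rained)

Note how only deduction guarantees its output: induction and abduction each return a fallible hypothesis, which is exactly what makes them useful in science and in diagnosis, respectively.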
See also:
When Did the Human Mind Evolve to What It is Today?
YouTube Video: The Evolution of the Human Mind
By Erin Wayman
Smithsonian Magazine
June 25, 2012
Archaeologists are finding signs of surprisingly sophisticated behavior in the ancient fossil record.
Archaeologists excavating a cave on the coast of South Africa not long ago unearthed an unusual abalone shell. Inside was a rusty red substance. After analyzing the mixture and nearby stone grinding tools, the researchers realized they had found the world’s earliest known paint, made 100,000 years ago from charcoal, crushed animal bones, iron-rich rock and an unknown liquid.
The abalone shell was a storage container—a prehistoric paint can.
The find revealed more than just the fact that people used paints so long ago. It provided a peek into the minds of early humans. Combining materials to create a product that doesn’t resemble the original ingredients and saving the concoction for later suggests people at the time were capable of abstract thinking, innovation and planning for the future.
These are among the mental abilities that many anthropologists say distinguished humans, Homo sapiens, from other hominids. Yet researchers have no agreed-upon definition of exactly what makes human cognition so special.
“It’s hard enough to tell what the cognitive abilities are of somebody who’s standing in front of you,” says Alison Brooks, an archaeologist at George Washington University and the Smithsonian Institution in Washington, D.C. “So it’s really hard to tell for someone who’s been dead for half a million years or a quarter million years.”
Since archaeologists can’t administer psychological tests to early humans, they have to examine artifacts left behind. When new technologies or ways of living appear in the archaeological record, anthropologists try to determine what sort of novel thinking was required to fashion a spear, say, or mix paint or collect shellfish.
The past decade has been particularly fruitful for finding such evidence. And archaeologists are now piecing together the patterns of behavior recorded in the archaeological record of the past 200,000 years to reconstruct the trajectory of how and when humans started to think and act like modern people.
There was a time when they thought they had it all figured out. In the 1970s, the consensus was simple: Modern cognition evolved in Europe 40,000 years ago. That’s when cave art, jewelry and sculpted figurines all seemed to appear for the first time. The art was a sign that humans could use symbols to represent their world and themselves, archaeologists reasoned, and therefore probably had language, too.
Neanderthals living nearby didn’t appear to make art, and thus symbolic thinking and language formed the dividing line between the two species’ mental abilities. (Today, archaeologists debate whether, and to what degree, Neanderthals were symbolic beings.)
One problem with this analysis was that the earliest fossils of modern humans came from Africa and dated to as many as 200,000 years ago—roughly 150,000 years before people were depicting bison and horses on cave walls in Spain.
Richard Klein, a paleoanthropologist at Stanford University, suggested that a genetic mutation occurred 40,000 years ago and caused an abrupt revolution in the way people thought and behaved.
In the decades following, however, archaeologists working in Africa brought down the notion that there was a lag between when the human body evolved and when modern thinking emerged. “As researchers began to more intensely investigate regions outside of Europe, the evidence of symbolic behavior got older and older,” says archaeologist April Nowell of the University of Victoria in Canada.
For instance, artifacts recovered over the past decade in South Africa— such as pigments made from red ochre, perforated shell beads and ostrich shells engraved with geometric designs—have pushed back the origins of symbolic thinking to more than 70,000 years ago, and in some cases, to as early as 164,000 years ago.
Now many anthropologists agree that modern cognition was probably in place when Homo sapiens emerged.
“It always made sense that the origins of modern human behavior, the full assembly of modern uniqueness, had to occur at the origin point of the lineage,” says Curtis Marean, a paleoanthropologist at Arizona State University in Tempe.
Marean thinks symbolic thinking was a crucial change in the evolution of the human mind. “When you have that, you have the ability to develop language. You have the ability to exchange recipes of technology,” he says. It also aided the formation of extended, long-distance social and trading networks, which other hominids such as Neanderthals lacked.
These advances enabled humans to spread into new, more complex environments, such as coastal locales, and eventually across the entire planet. “The world was their oyster,” Marean says.
But symbolic thinking may not account for all of the changes in the human mind, says Thomas Wynn, an archaeologist at the University of Colorado. Wynn and his colleague, University of Colorado psychologist Frederick Coolidge, suggest that advanced “working memory” was the final critical step toward modern cognition.
Working memory allows the brain to retrieve, process and hold in mind several chunks of information all at one time to complete a task. A particularly sophisticated kind of working memory “involves the ability to hold something in attention while you’re being distracted,” Wynn says.
In some ways, it’s kind of like multitasking. And it’s needed in problem solving, strategizing, innovating and planning. In chess, for example, the brain has to keep track of the pieces on the board, anticipate the opponent’s next several steps and prepare (and remember) countermoves for each possible outcome.
Finding evidence of this kind of cognition is challenging because humans don’t use advanced working memory all that much. “It requires a lot of effort,” Wynn says. “If we don’t have to use it, we don’t.” Instead, during routine tasks, the brain is sort of on autopilot, like when you drive your car to work: you’re not really thinking about it. Based on frequency alone, behaviors requiring working memory are less likely to be preserved than common activities that don’t need it, such as making simple stone choppers and handaxes.
Yet there are artifacts that do seem to relate to advanced working memory. Making tools composed of separate pieces, like a hafted spear or a bow and arrow, are examples that date to more than 70,000 years ago. But the most convincing example may be animal traps, Wynn says.
At South Africa’s Sibudu cave, Lyn Wadley, an archaeologist at the University of the Witwatersrand, has found clues that humans were hunting large numbers of small, and sometimes dangerous, forest animals, including bush pigs and diminutive antelopes called blue duikers. The only plausible way to capture such critters was with snares and traps.
With a trap, you have to think up a device that can snag and hold an animal and then return later to see whether it worked. “That’s the kind of thing working memory does for us,” Wynn says. “It allows us to work out those kinds of problems by holding the necessary information in mind.”
It may be too simple to say that symbolic thinking, language or working memory is the single thing that defines modern cognition, Marean says. And there still could be important components that haven’t yet been identified.
What’s needed now, Wynn adds, is more experimental archaeology. He suggests bringing people into a psych lab to evaluate what cognitive processes are engaged when participants make and use the tools and technology of early humans.
Another area that needs more investigation is what happened after modern cognition evolved. The pattern in the archaeological record shows a gradual accumulation of new and more sophisticated behaviors, Brooks says. Making complex tools, moving into new environments, engaging in long distance trade and wearing personal adornments didn’t all show up at once at the dawn of modern thinking.
The appearance of a slow and steady buildup may just be a consequence of the quirks of preservation. Organic materials like wood often decompose without a trace, so some signs of behavior may be too ephemeral to find. It’s also hard to spot new behaviors until they become widely adopted, so archaeologists are unlikely to ever locate the earliest instances of novel ways of living.
Complex lifestyles might not have been needed early on in the history of Homo sapiens, even if humans were capable of sophisticated thinking. Sally McBrearty, an archaeologist at the University of Connecticut in Storrs, points out in the 2007 book Rethinking the Human Revolution that certain developments might have been spurred by the need to find additional resources as populations expanded. Hunting and gathering new types of food, such as blue duikers, required new technologies.
Some see a slow progression in the accumulation of knowledge, while others see modern behavior evolving in fits and starts. Archaeologist Francesco d’Errico of the University of Bordeaux in France suggests certain advances show up early in the archaeological record only to disappear for tens of thousands of years before these behaviors—for whatever reason—get permanently incorporated into the human repertoire about 40,000 years ago. “It’s probably due to climatic changes, environmental variability and population size,” d’Errico says.
He notes that several tool technologies and aspects of symbolic expression, such as pigments and engraved artifacts, seem to disappear after 70,000 years ago. The timing coincides with a global cold spell that made Africa drier. Populations probably dwindled and fragmented in response to the climate change. Innovations might have been lost in a prehistoric version of the Dark Ages. And various groups probably reacted in different ways depending on cultural variation, d’Errico says. “Some cultures for example are more open to innovation.”
Perhaps the best way to settle whether the buildup of modern behavior was steady or punctuated is to find more archaeological sites to fill in the gaps. There are only a handful of sites, for example, that cover the beginning of human history. “We need those [sites] that date between 125,000 and 250,000 years ago,” Marean says. “That’s really the sweet spot.”
[End of Article]
___________________________________________________________________________
The evolution of human intelligence is closely tied to the evolution of the human brain and to the origin of language. The timeline of human evolution spans approximately 7 million years, from the separation of the genus Pan until the emergence of behavioral modernity by 50,000 years ago.
The first 3 million years of this timeline concern Sahelanthropus, the following 2 million concern Australopithecus and the final 2 million span the history of the genus Homo in the Paleolithic era.
Many traits of human intelligence, such as empathy, theory of mind, mourning, ritual, and the use of symbols and tools, are apparent in great apes although in less sophisticated forms than found in humans, such as great ape language.
Click on any of the following blue hyperlinks for more about The Evolution of Human Intelligence:
___________________________________________________________________________
Outline of Human Intelligence:
The following outline is provided as an overview of and topical guide to human intelligence:
Human intelligence is, in the human species, the mental capacities to learn, understand, and reason, including the capacities to comprehend ideas, plan, solve problems, and use language to communicate.
Click on any of the following blue hyperlinks to access Outline of Human Intelligence:
- Traits and aspects
- Emergence and evolution
- Augmented with technology
- Capacities
- Types of people, by intelligence
- Models and theories
- Related factors
- Fields that study human intelligence
- History
- Organizations
- Publications
- Scholars and researchers
- See also:
Perception
- YouTube Video: Visual Perception – How It Works
- YouTube Video: Mind the Gap Between Perception and Reality | Sean Tiffee | TEDxLSCTomball
- YouTube Video: The Problem with Politics: Selective Perception
Your WebHost: This web page and its opening topic present countering points of view, from the smallest of issues to those that can impact even world peace through the manipulation and distortion of the "facts"!
Perception (from the Latin perceptio, meaning gathering or receiving) is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment.
All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system. For example, vision involves light striking the retina of the eye; smell is mediated by odor molecules; and hearing involves pressure waves.
Perception is not only the passive receipt of these signals; it is also shaped by the recipient's learning, memory, expectation, and attention. Sensory processing transforms this low-level information into higher-level information (e.g., it extracts shapes for object recognition). The processing that follows connects these inputs with a person's concepts and expectations (or knowledge) and with restorative and selective mechanisms (such as attention) that influence perception.
Perception depends on complex functions of the nervous system, but subjectively seems mostly effortless because this processing happens outside conscious awareness.
Since the rise of experimental psychology in the 19th century, psychology's understanding of perception has progressed by combining a variety of techniques.
Psychophysics quantitatively describes the relationships between the physical qualities of the sensory input and perception. Sensory neuroscience studies the neural mechanisms underlying perception. Perceptual systems can also be studied computationally, in terms of the information they process. Perceptual issues in philosophy include the extent to which sensory qualities such as sound, smell or color exist in objective reality rather than in the mind of the perceiver.
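A classic example of the quantitative relationships psychophysics describes (added here only as an illustration; the law is not discussed on this page) is the Weber–Fechner law: the just-noticeable change \Delta I in a stimulus is a roughly constant fraction of its current intensity I, from which it follows that perceived sensation S grows only logarithmically as physical intensity rises:

\[
\frac{\Delta I}{I} = k, \qquad S = c \ln\frac{I}{I_0}
\]

Here I_0 is the detection threshold, and k and c are empirically fitted constants that differ from one sensory modality to another; doubling a sound's physical power, for example, does not double its perceived loudness.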
Although the senses were traditionally viewed as passive receptors, the study of illusions and ambiguous images has demonstrated that the brain's perceptual systems actively and pre-consciously attempt to make sense of their input. There is still active debate about the extent to which perception is an active process of hypothesis testing, analogous to science, or whether realistic sensory information is rich enough to make this process unnecessary.
The perceptual systems of the brain enable individuals to see the world around them as stable, even though the sensory information is typically incomplete and rapidly varying.
Human and animal brains are structured in a modular way, with different areas processing different kinds of sensory information. Some of these modules take the form of sensory maps, mapping some aspect of the world across part of the brain's surface. These different modules are interconnected and influence each other. For instance, taste is strongly influenced by smell.
"Percept" is also a term used by Deleuze and Guattari to define perception independent from perceivers.
Process and terminology:
The process of perception begins with an object in the real world, known as the distal stimulus or distal object. By means of light, sound, or another physical process, the object stimulates the body's sensory organs. These sensory organs transform the input energy into neural activity—a process called transduction. This raw pattern of neural activity is called the proximal stimulus. These neural signals are then transmitted to the brain and processed.
The resulting mental re-creation of the distal stimulus is the percept.
To explain the process of perception, an example could be an ordinary shoe. The shoe itself is the distal stimulus. When light from the shoe enters a person's eye and stimulates the retina, that stimulation is the proximal stimulus. The image of the shoe reconstructed by the brain of the person is the percept.
Another example could be a ringing telephone. The ringing of the phone is the distal stimulus. The sound stimulating a person's auditory receptors is the proximal stimulus. The brain's interpretation of this as the "ringing of a telephone" is the percept.
The different kinds of sensation (such as warmth, sound, and taste) are called sensory modalities or stimulus modalities.
Bruner's model of the perceptual process:
Psychologist Jerome Bruner developed a model of perception, in which people put "together the information contained in" a target and a situation to form "perceptions of ourselves and others based on social categories." This model is composed of three states:
- When we encounter an unfamiliar target, we are very open to the informational cues contained in the target and the situation surrounding it.
- The first stage doesn't give us enough information on which to base perceptions of the target, so we actively seek out cues to resolve this ambiguity. Gradually, we collect some familiar cues that enable us to make a rough categorization of the target (see also Social Identity Theory).
- The cues become less open and selective. We search for more cues that confirm the categorization of the target, and we actively ignore and even distort cues that violate our initial perceptions. Our perception becomes more selective, and we finally paint a consistent picture of the target.
Saks and Johns' three components to perception:
According to Alan Saks and Gary Johns, there are three components to perception:
- The Perceiver: a person whose awareness is focused on the stimulus and who thus begins to perceive it. Many factors may influence the perceptions of the perceiver; the three major ones are (1) motivational state, (2) emotional state, and (3) experience. All of these factors, especially the first two, greatly contribute to how the person perceives a situation. Oftentimes, the perceiver may employ what is called a "perceptual defense," in which the person will only "see what they want to see," perceiving only what they want to perceive even though the stimulus acts on their senses.
- The Target: the object of perception; something or someone who is being perceived. The amount of information gathered by the sensory organs of the perceiver affects the interpretation and understanding of the target.
- The Situation: the environmental factors, timing, and degree of stimulation that affect the process of perception. These factors can leave a single stimulus as merely a stimulus, rather than a percept subject to brain interpretation.
Multistable perception:
Stimuli are not necessarily translated into a percept, and rarely does a single stimulus translate into a single percept. An ambiguous stimulus may be transduced into multiple percepts, experienced randomly, one at a time, in a process termed "multistable perception."
The same stimuli, or the absence of them, may result in different percepts depending on the subject's culture and previous experiences.
Ambiguous figures demonstrate that a single stimulus can result in more than one percept. For example, the Rubin vase can be interpreted either as a vase or as two faces. The percept can bind sensations from multiple senses into a whole. A picture of a talking person on a television screen, for example, is bound to the sound of speech from speakers to form a percept of a talking person.
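To make the idea of multistable perception concrete, here is a minimal Python sketch that models it as a two-state process in which the current reading of a Rubin-vase-like figure occasionally flips. The function name and switching probability are illustrative choices, not taken from the perception literature:

```python
import random

def multistable_switches(p_switch=0.1, steps=20, seed=1):
    """Toy two-state model of multistable perception: at each time
    step the current percept either persists or flips to the
    alternative reading with probability p_switch."""
    random.seed(seed)
    percepts = ["vase", "faces"]          # the two readings of the Rubin vase
    current = random.choice(percepts)
    history = [current]
    for _ in range(steps - 1):
        if random.random() < p_switch:    # stochastic perceptual reversal
            current = percepts[1] if current == percepts[0] else percepts[0]
        history.append(current)
    return history

print(multistable_switches())
```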
Types of perception:
Vision:
Main article: Visual perception
In many ways, vision is the primary human sense. Light is taken in through each eye and focused in a way which sorts it on the retina according to direction of origin. A dense surface of photosensitive cells, including rods, cones, and intrinsically photosensitive retinal ganglion cells, captures information about the intensity, color, and position of incoming light.
Some processing of texture and movement occurs within the neurons on the retina before the information is sent to the brain. In total, about 15 differing types of information are then forwarded to the brain proper via the optic nerve.
Sound:
Main article: Hearing (sense)
Hearing (or audition) is the ability to perceive sound by detecting vibrations (i.e., sonic detection). Frequencies capable of being heard by humans are called audio or audible frequencies, the range of which is typically considered to be between 20 Hz and 20,000 Hz. Frequencies higher than audio are referred to as ultrasonic, while frequencies below audio are referred to as infrasonic.
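The stated frequency bands translate directly into a small classifier; the function name and the example values below are illustrative:

```python
def classify_frequency(hz: float) -> str:
    """Label a frequency relative to the nominal human audible band
    of roughly 20 Hz to 20,000 Hz, as stated above."""
    if hz < 20:
        return "infrasonic"
    if hz > 20_000:
        return "ultrasonic"
    return "audible"

for f in (5, 440, 18_000, 40_000):
    print(f, "Hz ->", classify_frequency(f))
```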
The auditory system includes the outer ears, which collect and filter sound waves; the middle ear, which transforms the sound pressure (impedance matching); and the inner ear, which produces neural signals in response to the sound. By the ascending auditory pathway these are led to the primary auditory cortex within the temporal lobe of the human brain, from where the auditory information then goes to the cerebral cortex for further processing.
Sound does not usually come from a single source: in real situations, sounds from multiple sources and directions are superimposed as they arrive at the ears. Hearing involves the computationally complex task of separating out sources of interest, identifying them and often estimating their distance and direction.
Touch:
Main article: Haptic perception
The process of recognizing objects through touch is known as haptic perception. It involves a combination of somatosensory perception of patterns on the skin surface (e.g., edges, curvature, and texture) and proprioception of hand position and conformation. People can rapidly and accurately identify three-dimensional objects by touch. This involves exploratory procedures, such as moving the fingers over the outer surface of the object or holding the entire object in the hand.
Haptic perception relies on the forces experienced during touch. Gibson defined the haptic system as "the sensibility of the individual to the world adjacent to his body by use of his body." Gibson and others emphasized the close link between body movement and haptic perception, where the latter is active exploration.
The concept of haptic perception is related to the concept of extended physiological proprioception according to which, when using a tool such as a stick, perceptual experience is transparently transferred to the end of the tool.
Taste:
Main article: Taste
Taste (formally known as gustation) is the ability to perceive the flavor of substances, including, but not limited to, food. Humans receive tastes through sensory organs concentrated on the upper surface of the tongue, called taste buds or gustatory calyculi. The human tongue has 100 to 150 taste receptor cells on each of its roughly ten thousand taste buds.
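Taken at face value, those figures imply on the order of a million taste receptor cells in total; a quick back-of-the-envelope check:

```python
# Rough estimate implied by the figures above: receptor cells per bud
# multiplied by the approximate number of taste buds on the tongue.
cells_per_bud = (100, 150)
num_buds = 10_000
low, high = (n * num_buds for n in cells_per_bud)
print(f"about {low:,} to {high:,} taste receptor cells in total")
```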
Traditionally, there have been four primary tastes: sweetness, bitterness, sourness, and saltiness. However, the recognition and awareness of umami, which is considered the fifth primary taste, is a relatively recent development in Western cuisine.
Other tastes can be mimicked by combining these basic tastes, all of which contribute only partially to the sensation and flavor of food in the mouth.
Other factors include:
- smell, which is detected by the olfactory epithelium of the nose;
- texture, which is detected through a variety of mechanoreceptors, muscle nerves, etc.;
- and temperature, which is detected by thermoreceptors.
All basic tastes are classified as either appetitive or aversive, depending upon whether the things they sense are harmful or beneficial.
Smell:
Main article: Olfaction
Smell is the process by which odor molecules, taken in through the nose, are absorbed by the olfactory organs. These molecules diffuse through a thick layer of mucus; come into contact with one of thousands of cilia projecting from sensory neurons; and are then absorbed into a receptor (one of 347 or so). It is this process that gives smell its physical basis.
Smell is also a very interactive sense, as scientists have begun to observe that olfaction interacts with the other senses in unexpected ways. It is also the most primal of the senses, known to be the first indicator of safety or danger, and therefore the sense that drives the most basic human survival skills. As such, it can be a catalyst for human behavior on a subconscious and instinctive level.
Social:
Main article: Social perception
Social perception is the part of perception that allows people to understand the individuals and groups of their social world. Thus, it is an element of social cognition.
Speech:
Main article: Speech perception
Speech perception is the process by which spoken language is heard, interpreted and understood. Research in this field seeks to understand how human listeners recognize the sound of speech (or phonetics) and use such information to understand spoken language.
Listeners manage to perceive words across a wide range of conditions, as the sound of a word can vary widely according to words that surround it and the tempo of the speech, as well as the physical characteristics, accent, tone, and mood of the speaker.
Reverberation, signifying the persistence of sound after the sound is produced, can also have a considerable impact on perception. Experiments have shown that people automatically compensate for this effect when hearing speech.
The process of perceiving speech begins at the level of the sound within the auditory signal and the process of audition. The initial auditory signal is compared with visual information—primarily lip movement—to extract acoustic cues and phonetic information. It is possible other sensory modalities are integrated at this stage as well. This speech information can then be used for higher-level language processes, such as word recognition.
Speech perception is not necessarily unidirectional. Higher-level language processes connected with morphology, syntax, and/or semantics may also interact with basic speech perception processes to aid in recognition of speech sounds. It may be the case that it is not necessary (maybe not even possible) for a listener to recognize phonemes before recognizing higher units, such as words.
In an experiment, Richard M. Warren replaced one phoneme of a word with a cough-like sound. His subjects restored the missing speech sound perceptually without any difficulty. Moreover, they were not able to accurately identify which phoneme had even been disturbed.
Faces:
Main article: Face perception
Facial perception refers to cognitive processes specialized in handling human faces (including perceiving the identity of an individual) and facial expressions (such as emotional cues).
Social touch:
Main article: Somatosensory system § Neural processing of social touch
The somatosensory cortex is a part of the brain that receives and encodes sensory information from receptors of the entire body.
Affective touch is a type of sensory information that elicits an emotional reaction and is usually social in nature. Such information is actually coded differently than other sensory information. Though the intensity of affective touch is still encoded in the primary somatosensory cortex, the feeling of pleasantness associated with affective touch is activated more in the anterior cingulate cortex.
Increased blood oxygen level-dependent (BOLD) contrast imaging, identified during functional magnetic resonance imaging (fMRI), shows that signals in the anterior cingulate cortex, as well as the prefrontal cortex, are highly correlated with pleasantness scores of affective touch.
Inhibitory transcranial magnetic stimulation (TMS) of the primary somatosensory cortex (S1) inhibits the perception of affective touch intensity, but not affective touch pleasantness. Therefore, S1 is not directly involved in processing socially affective touch pleasantness, but it still plays a role in discriminating touch location and intensity.
Multi-modal perception:
Multi-modal perception refers to concurrent stimulation in more than one sensory modality and the effect such has on the perception of events and objects in the world.
Time (chronoception):
Main article: Time perception
Chronoception refers to how the passage of time is perceived and experienced. Although the sense of time is not associated with a specific sensory system, the work of psychologists and neuroscientists indicates that human brains do have a system governing the perception of time, composed of a highly distributed system involving the cerebral cortex, cerebellum, and basal ganglia.
One particular component of the brain, the suprachiasmatic nucleus, is responsible for the circadian rhythm (commonly known as one's "internal clock"), while other cell clusters appear to be capable of shorter-range timekeeping, known as an ultradian rhythm.
One or more dopaminergic pathways in the central nervous system appear to have a strong modulatory influence on mental chronometry, particularly interval timing.
Agency:
Main article: Sense of agency
Sense of agency refers to the subjective feeling of having chosen a particular action. Some conditions, such as schizophrenia, can cause a loss of this sense, which may lead a person into delusions, such as feeling like a machine or like an outside source is controlling them.
An opposite extreme can also occur, where people experience everything in their environment as though they had decided that it would happen.
Even in non-pathological cases, there is a measurable difference between the making of a decision and the feeling of agency. Through methods such as the Libet experiment, a gap of half a second or more can be detected from the time when there are detectable neurological signs of a decision having been made to the time when the subject actually becomes conscious of the decision.
There are also experiments in which an illusion of agency is induced in psychologically normal subjects. In 1999, psychologists Wegner and Wheatley gave subjects instructions to move a mouse around a scene and point to an image about once every thirty seconds.
However, a second person—acting as a test subject but actually a confederate—had their hand on the mouse at the same time, and controlled some of the movement. Experimenters were able to arrange for subjects to perceive certain "forced stops" as if they were their own choice.
Familiarity:
Recognition memory is sometimes divided into two functions by neuroscientists: familiarity and recollection. A strong sense of familiarity can occur without any recollection, for example in cases of deja vu.
The temporal lobe (specifically the perirhinal cortex) responds differently to stimuli that feel novel compared to stimuli that feel familiar. Firing rates in the perirhinal cortex are connected with the sense of familiarity in humans and other mammals. In tests, stimulating this area at 10–15 Hz caused animals to treat even novel images as familiar, and stimulation at 30–40 Hz caused novel images to be partially treated as familiar.
In particular, stimulation at 30–40 Hz led to animals looking at a familiar image for longer periods, as they would for an unfamiliar one, though it did not lead to the same exploration behavior normally associated with novelty.
Recent studies on lesions in the area concluded that rats with a damaged perirhinal cortex were still more interested in exploring when novel objects were present, but seemed unable to tell novel objects from familiar ones—they examined both equally. Thus, other brain regions are involved with noticing unfamiliarity, while the perirhinal cortex is needed to associate the feeling with a specific source.
Sexual stimulation:
Main article: Sexual stimulation
Sexual stimulation is any stimulus (including bodily contact) that leads to, enhances, and maintains sexual arousal, possibly even leading to orgasm. Distinct from the general sense of touch, sexual stimulation is strongly tied to hormonal activity and chemical triggers in the body.
Although sexual arousal may arise without physical stimulation, achieving orgasm usually requires physical sexual stimulation (stimulation of the Krause-Finger corpuscles found in erogenous zones of the body.)
Other senses:
Main article: Sense
Other senses enable perception of body balance, acceleration, gravity, position of body parts, temperature, and pain. They can also enable perception of internal senses, such as suffocation, gag reflex, abdominal distension, fullness of rectum and urinary bladder, and sensations felt in the throat and lungs.
Reality:
In the case of visual perception, some people can actually see the percept shift in their mind's eye. Others, who are not picture thinkers, may not necessarily perceive the 'shape-shifting' as their world changes. This esemplastic nature has been demonstrated by an experiment that showed that ambiguous images have multiple interpretations on the perceptual level.
This confusing ambiguity of perception is exploited in human technologies such as camouflage and biological mimicry. For example, the wings of European peacock butterflies bear eyespots that birds respond to as though they were the eyes of a dangerous predator.
There is also evidence that the brain in some ways operates on a slight "delay" in order to allow nerve impulses from distant parts of the body to be integrated into simultaneous signals.
Perception is one of the oldest fields in psychology. The oldest quantitative laws in psychology are Weber's law, which states that the smallest noticeable difference in stimulus intensity is proportional to the intensity of the reference; and Fechner's law, which quantifies the relationship between the intensity of the physical stimulus and its perceptual counterpart (e.g., testing how much darker a computer screen can get before the viewer actually notices).
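Stated compactly, the two laws are usually written as follows, where I is stimulus intensity, ΔI the just-noticeable difference, I₀ the detection threshold, and k a constant that differs between modalities:

```latex
% Weber's law: the just-noticeable difference grows in proportion
% to the reference intensity (k is the Weber fraction).
\frac{\Delta I}{I} = k

% Fechner's law: perceived magnitude S grows with the logarithm of
% stimulus intensity I above the detection threshold I_0.
S = k \, \ln\!\left(\frac{I}{I_0}\right)
```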
The study of perception gave rise to the Gestalt school of psychology, with its emphasis on a holistic approach.
Physiology:
Main article: Sensory system
A sensory system is a part of the nervous system responsible for processing sensory information.
A sensory system consists of sensory receptors, neural pathways, and parts of the brain involved in sensory perception. Commonly recognized sensory systems are those for vision, hearing, somatic sensation (touch), taste and olfaction (smell), as listed above. It has been suggested that the immune system is an overlooked sensory modality. In short, senses are transducers from the physical world to the realm of the mind.
The receptive field is the specific part of the world to which a receptor organ and receptor cells respond. For instance, the part of the world an eye can see is its receptive field; the light that each rod or cone can see is its receptive field.
Receptive fields have so far been identified for the visual system, auditory system, and somatosensory system. Research attention is currently focused not only on external perception processes, but also on "interoception," the process of receiving, accessing, and appraising internal bodily signals.
Maintaining desired physiological states is critical for an organism's well-being and survival. Interoception is an iterative process, requiring the interplay between perception of body states and awareness of these states to generate proper self-regulation. Afferent sensory signals continuously interact with higher order cognitive representations of goals, history, and environment, shaping emotional experience and motivating regulatory behavior.
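To make the receptive-field idea concrete, here is a minimal Python sketch of a classic one-dimensional center-surround receptive field modeled as a difference of Gaussians. The function names and parameter values are illustrative choices, not taken from the text above:

```python
import math

def dog(x, sigma_c=1.0, sigma_s=3.0):
    """1-D difference-of-Gaussians: a standard toy model of a
    center-surround receptive field (excitatory center,
    inhibitory surround)."""
    g = lambda s: math.exp(-x**2 / (2 * s**2)) / (s * math.sqrt(2 * math.pi))
    return g(sigma_c) - g(sigma_s)

def response(stimulus_positions):
    """Summed response of the field to point stimuli at the given
    positions (0 = field center)."""
    return sum(dog(x) for x in stimulus_positions)

print(response([0.0]))   # light in the center: strong positive response
print(response([4.0]))   # light in the surround: weak negative response
```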
Features:
Constancy:
Main article: Subjective constancy
Perceptual constancy is the ability of perceptual systems to recognize the same object from widely varying sensory inputs. For example, individual people can be recognized from views, such as frontal and profile, which form very different shapes on the retina. A coin looked at face-on makes a circular image on the retina, but when held at an angle it makes an elliptical image.
In normal perception these are recognized as a single three-dimensional object. Without this correction process, an animal approaching from a distance would appear to gain in size.
One kind of perceptual constancy is color constancy: for example, a white piece of paper can be recognized as such under different colors and intensities of light.
Another example is roughness constancy: when a hand is drawn quickly across a surface, the touch nerves are stimulated more intensely. The brain compensates for this, so the speed of contact does not affect the perceived roughness.
Other constancies include melody, odor, brightness and words. These constancies are not always total, but the variation in the percept is much less than the variation in the physical stimulus. The perceptual systems of the brain achieve perceptual constancy in a variety of ways, each specialized for the kind of information being processed, with phonemic restoration as a notable example from hearing.
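As one concrete illustration of color constancy, the simple "gray-world" correction rescales each color channel so that the scene average comes out neutral, crudely discounting the illuminant. This is a minimal sketch of the idea, not a model of how the visual system actually achieves constancy:

```python
def gray_world(pixels):
    """Minimal gray-world white balance: rescale each RGB channel so
    the image mean is neutral, a crude analogue of discounting the
    color of the light source."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3
    return [tuple(p[c] * gray / means[c] for c in range(3)) for p in pixels]

# White paper under yellowish light: R and G inflated relative to B.
scene = [(230, 220, 150), (235, 225, 155)]
print(gray_world(scene))  # channels pulled back toward a neutral gray
```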
Grouping (Gestalt):
Main article: Principles of grouping
The principles of grouping (or Gestalt laws of grouping) are a set of principles in psychology, first proposed by Gestalt psychologists, to explain how humans naturally perceive objects as organized patterns and objects. Gestalt psychologists argued that these principles exist because the mind has an innate disposition to perceive patterns in the stimulus based on certain rules.
These principles are organized into six categories:
- Proximity: the principle of proximity states that, all else being equal, perception tends to group stimuli that are close together as part of the same object, and stimuli that are far apart as two separate objects.
- Similarity: stimuli that physically resemble each other tend to be perceived as part of the same object.
- Closure: perception tends to complete familiar figures, filling in gaps so that an incomplete outline is still seen as a whole object.
- Good continuation: elements arranged on a line or smooth curve tend to be grouped together and perceived as continuing along that path.
- Common fate: elements that move together in the same direction tend to be grouped as a single object.
- Good form: stimuli tend to be organized into the simplest, most regular and orderly figure possible.
Later research has identified additional grouping principles.
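The proximity principle in particular lends itself to a tiny computational illustration: group points whose pairwise distance falls under a threshold. The helper below is a hypothetical sketch, with the threshold chosen arbitrarily:

```python
def group_by_proximity(points, max_dist=1.5):
    """Naive illustration of the Gestalt proximity principle:
    points closer than max_dist end up in the same group
    (single-link clustering by merging overlapping groups)."""
    groups = []
    for p in points:
        near = [g for g in groups
                if any((p[0]-q[0])**2 + (p[1]-q[1])**2 <= max_dist**2 for q in g)]
        merged = [p]
        for g in near:
            merged.extend(g)
            groups.remove(g)
        groups.append(merged)
    return groups

# Two spatial clusters are perceived as two objects.
dots = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6)]
print(group_by_proximity(dots))
```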
Contrast effects:
Main article: Contrast effect
A common finding across many different kinds of perception is that the perceived qualities of an object can be affected by the qualities of context. If one object is extreme on some dimension, then neighboring objects are perceived as further away from that extreme.
"Simultaneous contrast effect" is the term used when stimuli are presented at the same time, whereas successive contrast applies when stimuli are presented one after another.
The contrast effect was noted by the 17th-century philosopher John Locke, who observed that lukewarm water can feel hot or cold depending on whether the hand touching it was previously in hot or cold water. In the early 20th century, Wilhelm Wundt identified contrast as a fundamental principle of perception, and since then the effect has been confirmed in many different areas.
These effects shape not only visual qualities like color and brightness, but other kinds of perception, including how heavy an object feels. One experiment found that thinking of the name "Hitler" led to subjects rating a person as more hostile. Whether a piece of music is perceived as good or bad can depend on whether the music heard before it was pleasant or unpleasant.
For the effect to work, the objects being compared need to be similar to each other: a television reporter can seem smaller when interviewing a tall basketball player, but not when standing next to a tall building. In the brain, brightness contrast exerts effects on both neuronal firing rates and neuronal synchrony.
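A toy model captures the flavor of these findings: shift the percept away from the context mean by some fraction of the difference. The constant k below is purely illustrative, not an empirical fit:

```python
def perceived(value, context_values, k=0.3):
    """Toy contrast model: the percept is pushed away from the
    context mean by a fraction k of the difference."""
    context_mean = sum(context_values) / len(context_values)
    return value + k * (value - context_mean)

hand_temp = 30.0  # lukewarm water, in degrees C
print(perceived(hand_temp, context_values=[10.0]))  # after cold water: feels warmer
print(perceived(hand_temp, context_values=[45.0]))  # after hot water: feels cooler
```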
Theories:
Perception as direct perception (Gibson):
Cognitive theories of perception assume there is a poverty of stimulus. This is the claim that sensations, by themselves, are unable to provide a unique description of the world. Sensations require 'enriching', which is the role of the mental model.
The perceptual ecology approach was introduced by James J. Gibson, who rejected the assumption of a poverty of stimulus and the idea that perception is based upon sensations. Instead, Gibson investigated what information is actually presented to the perceptual systems. His theory "assumes the existence of stable, unbounded, and permanent stimulus-information in the ambient optic array. And it supposes that the visual system can explore and detect this information. The theory is information-based, not sensation-based." He and the psychologists who work within this paradigm detailed how the world could be specified to a mobile, exploring organism via the lawful projection of information about the world into energy arrays.
"Specification" would be a 1:1 mapping of some aspect of the world into a perceptual array.
Given such a mapping, no enrichment is required and perception is direct.
Perception-in-action:
From Gibson's early work derived an ecological understanding of perception known as perception-in-action, which argues that perception is a requisite property of animate action. It posits that, without perception, action would be unguided, and without action, perception would serve no purpose.
Animate actions require both perception and motion, which can be described as "two sides of the same coin, the coin is action." Gibson works from the assumption that singular entities, which he calls invariants, already exist in the real world and that all that the perception process does is home in upon them.
The constructivist view, held by such philosophers as Ernst von Glasersfeld, regards the continual adjustment of perception and action to the external input as precisely what constitutes the "entity," which is therefore far from being invariant.
Glasersfeld considers an invariant as a target to be homed in upon, and a pragmatic necessity to allow an initial measure of understanding to be established prior to the updating that a statement aims to achieve. The invariant does not, and need not, represent an actuality.
Glasersfeld describes it as extremely unlikely that what is desired or feared by an organism will never suffer change as time goes on. This social constructionist theory thus allows for a needful evolutionary adjustment.
A mathematical theory of perception-in-action has been devised and investigated in many forms of controlled movement, and has been described in many different species of organism using the General Tau Theory. According to this theory, tau information, or time-to-goal information, is the fundamental percept in perception.
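The core quantity of General Tau Theory can be stated very simply: tau is a gap divided by its current rate of closure, i.e., the first-order time remaining until the gap closes. A minimal sketch, with a hypothetical example and sign convention chosen for readability:

```python
def time_to_closure(gap, closing_speed):
    """First-order estimate of the time until a gap closes at its
    current closing rate; tau in Lee's General Tau Theory is this
    quantity (up to sign convention, tau = gap / d(gap)/dt)."""
    return gap / closing_speed

# A ball 6 m away approaching at 12 m/s: about 0.5 s to contact.
print(time_to_closure(6.0, 12.0))
```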
Evolutionary psychology (EP):
Many philosophers, such as Jerry Fodor, write that the purpose of perception is knowledge. However, evolutionary psychologists hold that the primary purpose of perception is to guide action. They give the example of depth perception, which seems to have evolved not to help us know the distances to other objects but rather to help us move around in space.
Evolutionary psychologists argue that animals ranging from fiddler crabs to humans use eyesight for collision avoidance, suggesting that vision is basically for directing action, not providing knowledge. Neuropsychologists have shown that perceptual systems evolved to suit the specifics of each animal's activities. This explains why, for example, bats and worms perceive different ranges of auditory and visual stimuli than humans do.
Building and maintaining sense organs is metabolically expensive. More than half the brain is devoted to processing sensory information, and the brain itself consumes roughly one-fourth of one's metabolic resources. Thus, such organs evolve only when they provide exceptional benefits to an organism's fitness.
Scientists who study perception and sensation have long understood the human senses as adaptations. Depth perception consists of processing over half a dozen visual cues, each of which is based on a regularity of the physical world. Vision evolved to respond to the narrow range of electromagnetic energy that is plentiful and that does not pass through objects.
Sound waves provide useful information about the sources of and distances to objects, with larger animals making and hearing lower-frequency sounds and smaller animals making and hearing higher-frequency sounds. Taste and smell respond to chemicals in the environment that were significant for fitness in the environment of evolutionary adaptedness.
The sense of touch is actually many senses, including pressure, heat, cold, tickle, and pain. Pain, while unpleasant, is adaptive. An important adaptation for senses is range shifting, by which the organism becomes temporarily more or less sensitive to sensation.
For example, one's eyes automatically adjust to dim or bright ambient light. Sensory abilities of different organisms often co-evolve, as is the case with the hearing of echolocating bats and that of the moths that have evolved to respond to the sounds that the bats make.
Evolutionary psychologists claim that perception demonstrates the principle of modularity, with specialized mechanisms handling particular perception tasks. For example, people with damage to a particular part of the brain suffer from the specific defect of not being able to recognize faces (prosopagnosia). EP suggests that this indicates a so-called face-reading module.
Closed-loop perception:
The theory of closed-loop perception proposes a dynamic motor-sensory closed-loop process in which information flows through the environment and the brain in continuous loops.
Feature Integration Theory:
Main article: Feature integration theory
Anne Treisman's Feature Integration Theory (FIT) attempts to explain how characteristics of a stimulus such as physical location in space, motion, color, and shape are merged to form one percept despite each of these characteristics activating separate areas of the cortex. FIT explains this through a two-part system of perception involving the preattentive and focused attention stages.
The preattentive stage of perception is largely unconscious and analyzes an object by breaking it down into its basic features, such as specific color, geometric shape, motion, depth, and individual lines, among others. Studies have shown that, when small groups of objects with different features (e.g., a red triangle and a blue circle) are briefly flashed in front of human participants, many individuals later report seeing shapes made up of the combined features of two different stimuli, a phenomenon referred to as illusory conjunctions.
The unconnected features described in the preattentive stage are combined into the objects one normally sees during the focused attention stage. The focused attention stage is based heavily around the idea of attention in perception and 'binds' the features together onto specific objects at specific spatial locations (see the binding problem).
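A toy simulation of the preattentive stage conveys how illusory conjunctions could arise: color and shape are registered as separate feature lists, and when focused attention fails to bind them, the lists can recombine across objects. Everything here (names, probabilities) is illustrative, not Treisman's formal model:

```python
import random

def brief_glance(objects, p_bind_error=0.5, seed=4):
    """Toy preattentive stage: color and shape are registered as
    free-floating features; without focused attention to bind them,
    features from different objects may recombine, yielding an
    illusory conjunction."""
    random.seed(seed)
    colors = [c for c, _ in objects]
    shapes = [s for _, s in objects]
    if random.random() < p_bind_error:
        random.shuffle(colors)            # binding fails: features recombine
    return list(zip(colors, shapes))

stimuli = [("red", "triangle"), ("blue", "circle")]
print(brief_glance(stimuli))  # may report, e.g., a blue triangle
```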
Other theories of perception:
Effects on perception:
Effect of experience:
Main article: Perceptual learning
With experience, organisms can learn to make finer perceptual distinctions, and learn new kinds of categorization. Wine-tasting, the reading of X-ray images and music appreciation are applications of this process in the human sphere. Research has focused on the relation of this to other kinds of learning, and whether it takes place in peripheral sensory systems or in the brain's processing of sense information.
Empirical research shows that specific practices (such as yoga, mindfulness, Tai Chi, meditation, Daoshi and other mind-body disciplines) can modify human perceptual modality. Specifically, these practices enable perceptual skills to shift from the external (exteroceptive field) toward a greater ability to focus on internal signals (proprioception).
Also, when asked to provide verticality judgments, highly self-transcendent yoga practitioners were significantly less influenced by a misleading visual context. Increasing self-transcendence may enable yoga practitioners to optimize verticality judgment tasks by relying more on internal (vestibular and proprioceptive) signals coming from their own body, rather than on exteroceptive, visual cues.
Past actions and events that transpire right before an encounter or any form of stimulation have a strong degree of influence on how sensory stimuli are processed and perceived. On a basic level, the information our senses receive is often ambiguous and incomplete; it is grouped together so that we can understand the physical world around us.
It is these various forms of stimulation, combined with our previous knowledge and experience, that allow us to create our overall perception. For example, when engaging in conversation, we attempt to understand the speaker's message not only by attending to what we hear but also by drawing on the mouth shapes we have previously seen accompany those sounds.
Similarly, if a familiar topic comes up in another conversation, we use our previous knowledge to anticipate the direction the conversation is headed.
Effect of motivation and expectation:
Main article: Set (psychology)
A perceptual set, also called perceptual expectancy or just set, is a predisposition to perceive things in a certain way. It is an example of how perception can be shaped by "top-down" processes such as drives and expectations. Perceptual sets occur in all the different senses. They can be long term, such as a special sensitivity to hearing one's own name in a crowded room, or short term, as in the ease with which hungry people notice the smell of food.
A simple demonstration of the effect involved very brief presentations of non-words such as "sael". Subjects who were told to expect words about animals read it as "seal", but others who were expecting boat-related words read it as "sail".
Sets can be created by motivation and so can result in people interpreting ambiguous figures so that they see what they want to see. For instance, how someone perceives what unfolds during a sports game can be biased if they strongly support one of the teams.
In one experiment, students were allocated to pleasant or unpleasant tasks by a computer. They were told that either a number or a letter would flash on the screen to say whether they were going to taste an orange juice drink or an unpleasant-tasting health drink. In fact, an ambiguous figure was flashed on screen, which could either be read as the letter B or the number 13. When the letters were associated with the pleasant task, subjects were more likely to perceive a letter B, and when letters were associated with the unpleasant task they tended to perceive a number 13.
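One way to read the B/13 result is as simple Bayesian inference: the ambiguous figure itself favors neither reading (likelihood ratio near 1), so the context-induced prior decides the percept. A minimal sketch under that assumption, with illustrative prior values:

```python
def posterior_letter(likelihood_ratio=1.0, prior_letter=0.5):
    """Minimal Bayesian reading of the B/13 experiment: an ambiguous
    figure (likelihood ratio ~1) is disambiguated almost entirely by
    the prior that the task context induces."""
    prior_digit = 1.0 - prior_letter
    p = likelihood_ratio * prior_letter
    return p / (p + prior_digit)

print(posterior_letter(prior_letter=0.8))  # letters expected -> seen as "B"
print(posterior_letter(prior_letter=0.2))  # digits expected  -> seen as "13"
```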
Perceptual set has been demonstrated in many social contexts. When someone has a reputation for being funny, an audience is more likely to find them amusing. Individuals' perceptual sets reflect their own personality traits. For example, people with an aggressive personality are quicker to correctly identify aggressive words or situations.
One classic psychological experiment showed slower reaction times and less accurate answers when a deck of playing cards reversed the color of the suit symbol for some cards (e.g. red spades and black hearts).
Philosopher Andy Clark explains that perception, although it occurs quickly, is not simply a bottom-up process (where minute details are put together to form larger wholes). Instead, our brains use what he calls predictive coding. It starts with very broad constraints and expectations for the state of the world, and as expectations are met, it makes more detailed predictions (errors lead to new predictions, or learning processes).
Clark says this research has various implications; not only can there be no completely "unbiased, unfiltered" perception, but this means that there is a great deal of feedback between perception and expectation (perceptual experiences often shape our beliefs, but those perceptions were based on existing beliefs). Indeed, predictive coding provides an account where this type of feedback assists in stabilizing our inference-making process about the physical world, such as with perceptual constancy examples.
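The gist of predictive coding can be sketched in a few lines: the system issues a prediction, compares it against the input, and updates the prediction in proportion to the error. This loop is a bare-bones illustration of the idea, not Clark's formal account; the names and learning rate are illustrative:

```python
def predictive_update(prediction, observation, learning_rate=0.2, steps=5):
    """Bare-bones predictive-coding-style loop: predict, measure the
    prediction error against the input, and nudge the prediction to
    reduce that error (a gradient-like correction)."""
    for _ in range(steps):
        error = observation - prediction   # prediction error signal
        prediction += learning_rate * error
        print(f"prediction={prediction:.3f}  error={error:.3f}")
    return prediction

predictive_update(prediction=0.0, observation=1.0)
```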
Click on any of the following blue hyperlinks for more about "Perceptions"
Perception (from the Latin perceptio, meaning gathering or receiving) is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment.
All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system. For example, vision involves light striking the retina of the eye; smell is mediated by odor molecules; and hearing involves pressure waves.
Perception is not only the passive receipt of these signals, but it's also shaped by the recipient's learning, memory, expectation, and attention. Sensory input is a process that transforms this low-level information to higher-level information (e.g., extracts shapes for object recognition). The process that follows connects a person's concepts and expectations (or knowledge), restorative and selective mechanisms (such as attention) that influence perception.
Perception depends on complex functions of the nervous system, but subjectively seems mostly effortless because this processing happens outside conscious awareness.
Since the rise of experimental psychology in the 19th century, psychology's understanding of perception has progressed by combining a variety of techniques.
Psychophysics quantitatively describes the relationships between the physical qualities of the sensory input and perception. Sensory neuroscience studies the neural mechanisms underlying perception. Perceptual systems can also be studied computationally, in terms of the information they process. Perceptual issues in philosophy include the extent to which sensory qualities such as sound, smell or color exist in objective reality rather than in the mind of the perceiver.
Although the senses were traditionally viewed as passive receptors, the study of illusions and ambiguous images has demonstrated that the brain's perceptual systems actively and pre-consciously attempt to make sense of their input. There is still active debate about the extent to which perception is an active process of hypothesis testing, analogous to science, or whether realistic sensory information is rich enough to make this process unnecessary.
The perceptual systems of the brain enable individuals to see the world around them as stable, even though the sensory information is typically incomplete and rapidly varying.
Human and animal brains are structured in a modular way, with different areas processing different kinds of sensory information. Some of these modules take the form of sensory maps, mapping some aspect of the world across part of the brain's surface. These different modules are interconnected and influence each other. For instance, taste is strongly influenced by smell.
"Percept" is also a term used by Deleuze and Guattari to define perception independent from perceivers.
Process and terminology:
The process of perception begins with an object in the real world, known as the distal stimulus or distal object. By means of light, sound, or another physical process, the object stimulates the body's sensory organs. These sensory organs transform the input energy into neural activity—a process called transduction. This raw pattern of neural activity is called the proximal stimulus. These neural signals are then transmitted to the brain and processed.
The resulting mental re-creation of the distal stimulus is the percept.
To explain the process of perception, an example could be an ordinary shoe. The shoe itself is the distal stimulus. When light from the shoe enters a person's eye and stimulates the retina, that stimulation is the proximal stimulus. The image of the shoe reconstructed by the brain of the person is the percept.
Another example could be a ringing telephone. The ringing of the phone is the distal stimulus. The sound stimulating a person's auditory receptors is the proximal stimulus. The brain's interpretation of this as the "ringing of a telephone" is the percept.
The different kinds of sensation (such as warmth, sound, and taste) are called sensory modalities or stimulus modalities.
Bruner's model of the perceptual process:
Psychologist Jerome Bruner developed a model of perception, in which people put "together the information contained in" a target and a situation to form "perceptions of ourselves and others based on social categories." This model is composed of three states:
- When we encounter an unfamiliar target, we are very open to the informational cues contained in the target and the situation surrounding it.
- The first stage doesn't give us enough information on which to base perceptions of the target, so we will actively seek out cues to resolve this ambiguity. Gradually, we collect some familiar cues that enable us to make a rough categorization of the target. (see also Social Identity Theory)
- The cues become less open and selective. We try to search for more cues that confirm the categorization of the target. We also actively ignore and even distort cues that violate our initial perceptions. Our perception becomes more selective and we finally paint a consistent picture of the target.
Saks and John's three components to perception:
According to Alan Saks and Gary Johns, there are three components to perception:
- The Perceiver: a person whose awareness is focused on the stimulus, and thus begins to perceive it. There are many factors that may influence the perceptions of the perceiver, while the three major ones include (1) motivational state, (2) emotional state, and (3) experience. All of these factors, especially the first two, greatly contribute to how the person perceives a situation. Oftentimes, the perceiver may employ what is called a "perceptual defense," where the person will only "see what they want to see"—i.e., they will only perceives what they want to perceive even though the stimulus acts on his or her senses.
- The Target: the object of perception; something or someone who is being perceived. The amount of information gathered by the sensory organs of the perceiver affects the interpretation and understanding about the target.
- The Situation: the environmental factors, timing, and degree of stimulation that affect the process of perception. These factors may render a single stimulus to be left as merely a stimulus, not a percept that is subject for brain interpretation.
Multistable perception:
Stimuli are not necessarily translated into a percept and rarely does a single stimulus translate into a percept. An ambiguous stimulus may sometimes be transduced into one or more percepts, experienced randomly, one at a time, in a process termed "multistable perception."
The same stimuli, or absence of them, may result in different percepts depending on subject's culture and previous experiences.
Ambiguous figures demonstrate that a single stimulus can result in more than one percept. For example, the Rubin vase can be interpreted either as a vase or as two faces. The percept can bind sensations from multiple senses into a whole. A picture of a talking person on a television screen, for example, is bound to the sound of speech from speakers to form a percept of a talking person.
Types of perception:
Vision:
Main article: Visual perception
In many ways, vision is the primary human sense. Light is taken in through each eye and focused in a way which sorts it on the retina according to direction of origin. A dense surface of photosensitive cells, including rods, cones, and intrinsically photosensitive retinal ganglion cells captures information about the intensity, color, and position of incoming light.
Some processing of texture and movement occurs within the neurons on the retina before the information is sent to the brain. In total, about 15 differing types of information are then forwarded to the brain proper via the optic nerve.
Sound:
Main article: Hearing (sense)
Hearing (or audition) is the ability to perceive sound by detecting vibrations (i.e., sonic detection). Frequencies capable of being heard by humans are called audio or audible frequencies, the range of which is typically considered to be between 20 Hz and 20,000 Hz. Frequencies higher than audio are referred to as ultrasonic, while frequencies below audio are referred to as infrasonic.
The auditory system includes the outer ears, which collect and filter sound waves; the middle ear, which transforms the sound pressure (impedance matching); and the inner ear, which produces neural signals in response to the sound. By the ascending auditory pathway these are led to the primary auditory cortex within the temporal lobe of the human brain, from where the auditory information then goes to the cerebral cortex for further processing.
Sound does not usually come from a single source: in real situations, sounds from multiple sources and directions are superimposed as they arrive at the ears. Hearing involves the computationally complex task of separating out sources of interest, identifying them and often estimating their distance and direction.
Touch:
Main article: Haptic perception
The process of recognizing objects through touch is known as haptic perception. It involves a combination of somatosensory perception of patterns on the skin surface (e.g., edges, curvature, and texture) and proprioception of hand position and conformation. People can rapidly and accurately identify three-dimensional objects by touch. This involves exploratory procedures, such as moving the fingers over the outer surface of the object or holding the entire object in the hand.
Haptic perception relies on the forces experienced during touch. Gibson defined the haptic system as "the sensibility of the individual to the world adjacent to his body by use of his body." Gibson and others emphasized the close link between body movement and haptic perception, where the latter is active exploration.
The concept of haptic perception is related to the concept of extended physiological proprioception according to which, when using a tool such as a stick, perceptual experience is transparently transferred to the end of the tool.
Taste:
Main article: Taste
Taste (formally known as gustation) is the ability to perceive the flavor of substances, including, but not limited to, food. Humans receive tastes through sensory organs concentrated on the upper surface of the tongue, called taste buds or gustatory calyculi. The human tongue has 100 to 150 taste receptor cells on each of its roughly-ten thousand taste buds.
Traditionally, there have been four primary tastes: sweetness, bitterness, sourness, and saltiness. However, the recognition and awareness of umami, which is considered the fifth primary taste, is a relatively recent development in Western cuisine.
Other tastes can be mimicked by combining these basic tastes, all of which contribute only partially to the sensation and flavor of food in the mouth.
Other factors include:
- smell, which is detected by the olfactory epithelium of the nose;
- texture, which is detected through a variety of mechanoreceptors, muscle nerves, etc.;
- and temperature, which is detected by thermoreceptors.
All basic tastes are classified as either appetitive or aversive, depending upon whether the things they sense are harmful or beneficial.
Smell:
Main article: Olfaction
Smell is the process of absorbing molecules through olfactory organs, which are absorbed by humans through the nose. These molecules diffuse through a thick layer of mucus; come into contact with one of thousands of cilia that are projected from sensory neurons; and are then absorbed into a receptor (one of 347 or so). It is this process that causes humans to understand the concept of smell from a physical standpoint.
Smell is also a very interactive sense as scientists have begun to observe that olfaction comes into contact with the other sense in unexpected ways. It is also the most primal of the senses, as it is known to be the first indicator of safety or danger, therefore being the sense that drives the most basic of human survival skills. As such, it can be a catalyst for human behavior on a subconscious and instinctive level.
Social:
Main article: Social perception
Social perception is the part of perception that allows people to understand the individuals and groups of their social world. Thus, it is an element of social cognition.
Speech:
Main article: Speech perception
Speech perception is the process by which spoken language is heard, interpreted and understood. Research in this field seeks to understand how human listeners recognize the sound of speech (or phonetics) and use such information to understand spoken language.
Listeners manage to perceive words across a wide range of conditions, as the sound of a word can vary widely according to words that surround it and the tempo of the speech, as well as the physical characteristics, accent, tone, and mood of the speaker.
Reverberation, signifying the persistence of sound after the sound is produced, can also have a considerable impact on perception. Experiments have shown that people automatically compensate for this effect when hearing speech.
The process of perceiving speech begins at the level of the sound within the auditory signal and the process of audition. The initial auditory signal is compared with visual information—primarily lip movement—to extract acoustic cues and phonetic information. It is possible other sensory modalities are integrated at this stage as well. This speech information can then be used for higher-level language processes, such as word recognition.
Speech perception is not necessarily unidirectional. Higher-level language processes connected with morphology, syntax, and/or semantics may also interact with basic speech perception processes to aid in recognition of speech sounds. It may be the case that it is not necessary (maybe not even possible) for a listener to recognize phonemes before recognizing higher units, such as words.
In an experiment, Richard M. Warren replaced one phoneme of a word with a cough-like sound. His subjects restored the missing speech sound perceptually without any difficulty. Moreover, they were not able to accurately identify which phoneme had even been disturbed.
Faces:
Main article: Face perception
Facial perception refers to cognitive processes specialized in handling human faces (including perceiving the identity of an individual) and facial expressions (such as emotional cues.)
Social touch:
Main article: Somatosensory system § Neural processing of social touch
The somatosensory cortex is a part of the brain that receives and encodes sensory information from receptors of the entire body.
Affective touch is a type of sensory information that elicits an emotional reaction and is usually social in nature. Such information is actually coded differently than other sensory information. Though the intensity of affective touch is still encoded in the primary somatosensory cortex, the feeling of pleasantness associated with affective touch is activated more in the anterior cingulate cortex.
Increased blood oxygen level-dependent (BOLD) contrast imaging, identified during functional magnetic resonance imaging (fMRI), shows that signals in the anterior cingulate cortex, as well as the prefrontal cortex, are highly correlated with pleasantness scores of affective touch.
Inhibitory transcranial magnetic stimulation (TMS) of the primary somatosensory cortex inhibits the perception of affective touch intensity, but not affective touch pleasantness. Therefore, the S1 is not directly involved in processing socially affective touch pleasantness, but still plays a role in discriminating touch location and intensity.
Multi-modal perception:
Multi-modal perception refers to concurrent stimulation in more than one sensory modality and the effect such has on the perception of events and objects in the world.
Time (chronoception):
Main article: time perception
Chronoception refers to how the passage of time is perceived and experienced. Although the sense of time is not associated with a specific sensory system, the work of psychologists and neuroscientists indicates that human brains do have a system governing the perception of time, composed of a highly distributed system involving the cerebral cortex, cerebellum, and basal ganglia.
One particular component of the brain, the suprachiasmatic nucleus, is responsible for the circadian rhythm (commonly known as one's "internal clock"), while other cell clusters appear to be capable of shorter-range timekeeping, known as an ultradian rhythm.
One or more dopaminergic pathways in the central nervous system appear to have a strong modulatory influence on mental chronometry, particularly interval timing.
Agency:
Main article: Sense of agency
Sense of agency refers to the subjective feeling of having chosen a particular action. Some conditions, such as schizophrenia, can cause a loss of this sense, which may lead a person into delusions, such as feeling like a machine or like an outside source is controlling them.
An opposite extreme can also occur, where people experience everything in their environment as though they had decided that it would happen.
Even in non-pathological cases, there is a measurable difference between the making of a decision and the feeling of agency. Through methods such as the Libet experiment, a gap of half a second or more can be detected from the time when there are detectable neurological signs of a decision having been made to the time when the subject actually becomes conscious of the decision.
There are also experiments in which an illusion of agency is induced in psychologically normal subjects. In 1999, psychologists Wegner and Wheatley gave subjects instructions to move a mouse around a scene and point to an image about once every thirty seconds.
However, a second person—acting as a test subject but actually a confederate—had their hand on the mouse at the same time, and controlled some of the movement. Experimenters were able to arrange for subjects to perceive certain "forced stops" as if they were their own choice.
Familiarity:
Recognition memory is sometimes divided into two functions by neuroscientists: familiarity and recollection. A strong sense of familiarity can occur without any recollection, for example in cases of deja vu.
The temporal lobe (specifically the perirhinal cortex) responds differently to stimuli that feel novel compared to stimuli that feel familiar. Firing rates in the perirhinal cortex are connected with the sense of familiarity in humans and other mammals. In tests, stimulating this area at 10–15 Hz caused animals to treat even novel images as familiar, and stimulation at 30–40 Hz caused novel images to be partially treated as familiar.
In particular, stimulation at 30–40 Hz led to animals looking at a familiar image for longer periods, as they would for an unfamiliar one, though it did not lead to the same exploration behavior normally associated with novelty.
Recent studies on lesions in the area concluded that rats with a damaged perirhinal cortex were still more interested in exploring when novel objects were present, but seemed unable to tell novel objects from familiar ones—they examined both equally. Thus, other brain regions are involved with noticing unfamiliarity, while the perirhinal cortex is needed to associate the feeling with a specific source.
Sexual stimulation:
Main article: Sexual stimulation
Sexual stimulation is any stimulus (including bodily contact) that leads to, enhances, and maintains sexual arousal, possibly even leading to orgasm. Distinct from the general sense of touch, sexual stimulation is strongly tied to hormonal activity and chemical triggers in the body.
Although sexual arousal may arise without physical stimulation, achieving orgasm usually requires physical sexual stimulation (stimulation of the Krause-Finger corpuscles found in erogenous zones of the body.)
Other senses:
Main article: Sense
Other senses enable perception of body balance, acceleration, gravity, position of body parts, temperature, and pain. They can also enable perception of internal senses, such as suffocation, gag reflex, abdominal distension, fullness of rectum and urinary bladder, and sensations felt in the throat and lungs.
Reality:
In the case of visual perception, some people can actually see the percept shift in their mind's eye. Others, who are not picture thinkers, may not necessarily perceive the 'shape-shifting' as their world changes. This esemplastic nature has been demonstrated by an experiment that showed that ambiguous images have multiple interpretations on the perceptual level.
This confusing ambiguity of perception is exploited in human technologies such as camouflage and biological mimicry. For example, the wings of European peacock butterflies bear eyespots that birds respond to as though they were the eyes of a dangerous predator.
There is also evidence that the brain in some ways operates on a slight "delay" in order to allow nerve impulses from distant parts of the body to be integrated into simultaneous signals.
Perception is one of the oldest fields in psychology. The oldest quantitative laws in psychology are Weber's law, which states that the smallest noticeable difference in stimulus intensity is proportional to the intensity of the reference; and Fechner's law, which quantifies the relationship between the intensity of the physical stimulus and its perceptual counterpart (e.g., testing how much darker a computer screen can get before the viewer actually notices).
The study of perception gave rise to the Gestalt School of Psychology, with an emphasis on holistic approach.
Physiology:
Main article: Sensory system
A sensory system is a part of the nervous system responsible for processing sensory information.
A sensory system consists of sensory receptors, neural pathways, and parts of the brain involved in sensory perception. Commonly recognized sensory systems are those for vision, hearing, somatic sensation (touch), taste and olfaction (smell), as listed above. It has been suggested that the immune system is an overlooked sensory modality. In short, senses are transducers from the physical world to the realm of the mind.
The receptive field is the specific part of the world to which a receptor organ and receptor cells respond. For instance, the part of the world an eye can see is its receptive field; the light that each rod or cone can see is its receptive field.
So far, receptive fields have been identified for the visual, auditory, and somatosensory systems. Research attention currently focuses not only on external perception processes, but also on "interoception", the process of receiving, accessing, and appraising internal bodily signals.
Maintaining desired physiological states is critical for an organism's well-being and survival. Interoception is an iterative process, requiring the interplay between perception of body states and awareness of these states to generate proper self-regulation. Afferent sensory signals continuously interact with higher order cognitive representations of goals, history, and environment, shaping emotional experience and motivating regulatory behavior.
Features:
Constancy:
Main article: Subjective constancy
Perceptual constancy is the ability of perceptual systems to recognize the same object from widely varying sensory inputs. For example, individual people can be recognized from views, such as frontal and profile, which form very different shapes on the retina. A coin looked at face-on makes a circular image on the retina, but when held at an angle it makes an elliptical image.
In normal perception these are recognized as a single three-dimensional object. Without this correction process, an animal approaching from a distance would appear to gain in size.
One kind of perceptual constancy is color constancy: for example, a white piece of paper can be recognized as such under different colors and intensities of light.
Another example is roughness constancy: when a hand is drawn quickly across a surface, the touch nerves are stimulated more intensely. The brain compensates for this, so the speed of contact does not affect the perceived roughness.
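As a toy numerical sketch of this compensation (assuming, purely for illustration, that receptor drive scales linearly with both surface roughness and hand speed), dividing the receptor response by the self-generated movement speed yields a speed-invariant roughness estimate:

    # Toy sketch, not a physiological model: receptor drive is assumed
    # to grow with both roughness and hand speed; dividing out the
    # known movement speed recovers a speed-invariant estimate.
    def receptor_drive(roughness, speed):
        return roughness * speed      # assumed linear coding

    def perceived_roughness(drive, speed):
        return drive / speed          # compensation for movement speed

    for speed in (0.5, 1.0, 2.0):
        drive = receptor_drive(roughness=3.0, speed=speed)
        print(speed, perceived_roughness(drive, speed))  # always 3.0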
Other constancies include melody, odor, brightness and words. These constancies are not always total, but the variation in the percept is much less than the variation in the physical stimulus. The perceptual systems of the brain achieve perceptual constancy in a variety of ways, each specialized for the kind of information being processed, with phonemic restoration as a notable example from hearing.
Grouping (Gestalt):
Main article: Principles of grouping
The principles of grouping (or Gestalt laws of grouping) are a set of principles in psychology, first proposed by Gestalt psychologists, to explain how humans naturally perceive objects as organized patterns and objects. Gestalt psychologists argued that these principles exist because the mind has an innate disposition to perceive patterns in the stimulus based on certain rules.
These principles are organized into six categories:
- Proximity: the principle of proximity states that, all else being equal, perception tends to group stimuli that are close together as part of the same object, and stimuli that are far apart as two separate objects (a toy sketch of this grouping rule follows this list).
- Similarity: the principle of similarity states that, all else being equal, perception tends to see stimuli that physically resemble each other as part of the same object, and stimuli that differ as parts of separate objects. This allows people to distinguish between adjacent and overlapping objects based on their visual texture and resemblance.
- Closure: the principle of closure refers to the mind's tendency to see complete figures or forms even if a picture is incomplete, partially hidden by other objects, or missing some of the information needed to make a complete picture in our minds. For example, if part of a shape's border is missing, people still tend to see the shape as completely enclosed by the border and ignore the gaps.
- Good Continuation: the principle of good continuation makes sense of stimuli that overlap: when there is an intersection between two or more objects, people tend to perceive each as a single uninterrupted object.
- Common Fate: the principle of common fate groups stimuli together on the basis of their movement. When visual elements are seen moving in the same direction at the same rate, perception associates the movement as part of the same stimulus. This allows people to make out moving objects even when other details, such as color or outline, are obscured.
- Good Form: the principle of good form refers to the tendency to group together forms of similar shape, pattern, color, etc.
Later research has identified additional grouping principles.
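The proximity principle can be caricatured as simple threshold-based clustering. A minimal sketch in Python (the threshold value is arbitrary, and this is an illustration of the grouping rule, not a model of vision):

    # Group sorted 1-D points: a point closer than `threshold` to the
    # previous point joins its group; a farther point starts a new group.
    def group_by_proximity(points, threshold=1.0):
        groups = []
        for p in sorted(points):
            if groups and p - groups[-1][-1] <= threshold:
                groups[-1].append(p)   # close enough: same group
            else:
                groups.append([p])     # too far: new group
        return groups

    print(group_by_proximity([0.0, 0.4, 0.9, 5.0, 5.3]))
    # -> [[0.0, 0.4, 0.9], [5.0, 5.3]]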
Contrast effects:
Main article: Contrast effect
A common finding across many different kinds of perception is that the perceived qualities of an object can be affected by the qualities of context. If one object is extreme on some dimension, then neighboring objects are perceived as further away from that extreme.
"Simultaneous contrast effect" is the term used when stimuli are presented at the same time, whereas successive contrast applies when stimuli are presented one after another.
The contrast effect was noted by the 17th-century philosopher John Locke, who observed that lukewarm water can feel hot or cold depending on whether the hand touching it was previously in hot or cold water. In the early 20th century, Wilhelm Wundt identified contrast as a fundamental principle of perception, and since then the effect has been confirmed in many different areas.
These effects shape not only visual qualities like color and brightness, but other kinds of perception, including how heavy an object feels. One experiment found that thinking of the name "Hitler" led to subjects rating a person as more hostile. Whether a piece of music is perceived as good or bad can depend on whether the music heard before it was pleasant or unpleasant.
For the effect to work, the objects being compared need to be similar to each other: a television reporter can seem smaller when interviewing a tall basketball player, but not when standing next to a tall building. In the brain, brightness contrast exerts effects on both neuronal firing rates and neuronal synchrony.
Theories:
Perception as direct perception (Gibson):
Cognitive theories of perception assume there is a poverty of stimulus. This is the claim that sensations, by themselves, are unable to provide a unique description of the world. Sensations require 'enriching', which is the role of the mental model.
The perceptual ecology approach was introduced by James J. Gibson, who rejected the assumption of a poverty of stimulus and the idea that perception is based upon sensations. Instead, Gibson investigated what information is actually presented to the perceptual systems. His theory "assumes the existence of stable, unbounded, and permanent stimulus-information in the ambient optic array. And it supposes that the visual system can explore and detect this information.
The theory is information-based, not sensation-based." He and the psychologists who work within this paradigm detailed how the world could be specified to a mobile, exploring organism via the lawful projection of information about the world into energy arrays.
"Specification" would be a 1:1 mapping of some aspect of the world into a perceptual array.
Given such a mapping, no enrichment is required and perception is direct.
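One way to put this claim in symbols (an illustrative notation, not Gibson's own): if the mapping from world states to the optic array is one-to-one, it is invertible on its image, so the world state can be read off the array without inferential enrichment:

    % f maps world states W into the ambient optic array A;
    % if f is injective, the world state w is recoverable from a.
    \[ f : W \to A \ \text{injective} \;\implies\; w = f^{-1}(a) \]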
Perception-in-action:
From Gibson's early work derived an ecological understanding of perception known as perception-in-action, which argues that perception is a requisite property of animate action. It posits that, without perception, action would be unguided, and without action, perception would serve no purpose.
Animate actions require both perception and motion, which can be described as "two sides of the same coin, the coin is action." Gibson works from the assumption that singular entities, which he calls invariants, already exist in the real world and that all that the perception process does is home in upon them.
The constructivist view, held by such philosophers as Ernst von Glasersfeld, regards the continual adjustment of perception and action to the external input as precisely what constitutes the "entity," which is therefore far from being invariant.
Glasersfeld considers an invariant as a target to be homed in upon, and a pragmatic necessity to allow an initial measure of understanding to be established prior to the updating that a statement aims to achieve. The invariant does not, and need not, represent an actuality.
Glasersfeld describes it as extremely unlikely that what is desired or feared by an organism will never suffer change as time goes on. This social constructionist theory thus allows for a needful evolutionary adjustment.
A mathematical theory of perception-in-action has been devised and investigated in many forms of controlled movement, and has been described in many different species of organism using the General Tau Theory. According to this theory, tau information, or time-to-goal information, is the fundamental percept in perception.
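In its simplest and most commonly cited form, tau for a motion gap x(t) closing at rate \dot{x}(t) is the time the gap would take to close at the current rate:

    \[ \tau(t) = \frac{x(t)}{\dot{x}(t)} \]

(This is a sketch of the core quantity only; the full theory also treats the coupling of multiple gaps during guided movement.)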
Evolutionary psychology (EP):
Many philosophers, such as Jerry Fodor, write that the purpose of perception is knowledge. However, evolutionary psychologists hold that the primary purpose of perception is to guide action. They give the example of depth perception, which seems to have evolved not to help us know the distances to other objects but rather to help us move around in space.
Evolutionary psychologists argue that animals ranging from fiddler crabs to humans use eyesight for collision avoidance, suggesting that vision is basically for directing action, not providing knowledge. Neuropsychologists have shown that perceptual systems evolved to suit the specifics of each animal's activities. This explains why, for example, bats and worms are sensitive to different ranges of auditory and visual stimuli than humans are.
Building and maintaining sense organs is metabolically expensive. More than half the brain is devoted to processing sensory information, and the brain itself consumes roughly one-fourth of one's metabolic resources. Thus, such organs evolve only when they provide exceptional benefits to an organism's fitness.
Scientists who study perception and sensation have long understood the human senses as adaptations. Depth perception consists of processing over half a dozen visual cues, each of which is based on a regularity of the physical world. Vision evolved to respond to the narrow range of electromagnetic energy that is plentiful and that does not pass through objects.
Sound waves provide useful information about the sources of and distances to objects, with larger animals making and hearing lower-frequency sounds and smaller animals making and hearing higher-frequency sounds. Taste and smell respond to chemicals in the environment that were significant for fitness in the environment of evolutionary adaptedness.
The sense of touch is actually many senses, including pressure, heat, cold, tickle, and pain. Pain, while unpleasant, is adaptive. An important adaptation for senses is range shifting, by which the organism becomes temporarily more or less sensitive to sensation.
For example, one's eyes automatically adjust to dim or bright ambient light. Sensory abilities of different organisms often co-evolve, as is the case with the hearing of echolocating bats and that of the moths that have evolved to respond to the sounds that the bats make.
Evolutionary psychologists claim that perception demonstrates the principle of modularity, with specialized mechanisms handling particular perception tasks. For example, people with damage to a particular part of the brain suffer from the specific defect of not being able to recognize faces (prosopagnosia). EP suggests that this indicates a so-called face-reading module.
Closed-loop perception:
The theory of closed-loop perception proposes a dynamic motor-sensory closed-loop process in which information flows through the environment and the brain in continuous loops.
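A minimal sketch of such a loop (assuming, for illustration, a one-dimensional world and a proportional controller standing in for the organism): each action changes the world, which changes the next sensory sample, which shapes the next action:

    # Perception and action run in one continuous loop.
    target, position = 10.0, 0.0
    for step in range(20):
        sensed_gap = target - position   # sensory sample of the world
        action = 0.3 * sensed_gap        # motor command from the percept
        position += action               # the action changes the world
    print(round(position, 2))            # the gap closes over the loop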
Feature Integration Theory:
Main article: Feature integration theory
Anne Treisman's Feature Integration Theory (FIT) attempts to explain how characteristics of a stimulus such as physical location in space, motion, color, and shape are merged to form one percept despite each of these characteristics activating separate areas of the cortex. FIT explains this through a two-part system of perception involving the preattentive and focused attention stages.
The preattentive stage of perception is largely unconscious, and analyzes an object by breaking it down into its basic features, such as specific color, geometric shape, motion, depth, and individual lines. Studies have shown that, when small groups of objects with different features (e.g., a red triangle and a blue circle) are briefly flashed in front of human participants, many individuals later report seeing shapes made up of the combined features of two different stimuli; these are referred to as illusory conjunctions.
The unconnected features described in the preattentive stage are combined into the objects one normally sees during the focused attention stage. The focused attention stage is based heavily on the idea of attention in perception and 'binds' the features together onto specific objects at specific spatial locations (see the binding problem).
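A toy sketch of the two-stage idea (illustrative only; FIT is a psychological theory, not an algorithm, and the names below are invented for the example): features are first registered in separate maps, and focused attention binds them by location; without attention, bindings can recombine, yielding illusory conjunctions:

    import random

    stimuli = [("red", "triangle"), ("blue", "circle")]
    colors = [c for c, _ in stimuli]   # preattentive color map
    shapes = [s for _, s in stimuli]   # preattentive shape map

    def report(attended):
        if attended:
            return stimuli                   # features bound by location
        random.shuffle(shapes)               # unbound features recombine
        return list(zip(colors, shapes))     # possible illusory conjunction

    print(report(attended=True))   # veridical report
    print(report(attended=False))  # may yield ("red", "circle"), ...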
Other theories of perception:
- Empirical Theory of Perception
- Enactivism
- The Interactive Activation and Competition Model
- Recognition-By-Components Theory (Irving Biederman)
Effects on perception:
Effect of experience:
Main article: Perceptual learning
With experience, organisms can learn to make finer perceptual distinctions, and learn new kinds of categorization. Wine-tasting, the reading of X-ray images and music appreciation are applications of this process in the human sphere. Research has focused on the relation of this to other kinds of learning, and whether it takes place in peripheral sensory systems or in the brain's processing of sense information.
Empirical research shows that specific practices (such as yoga, mindfulness, Tai Chi, meditation, Daoshi and other mind-body disciplines) can modify human perceptual modality. Specifically, these practices enable perception skills to switch from the external (exteroceptive field) towards a greater ability to focus on internal signals (proprioception).
Also, when asked to provide verticality judgments, highly self-transcendent yoga practitioners were significantly less influenced by a misleading visual context. Increasing self-transcendence may enable yoga practitioners to optimize verticality judgment tasks by relying more on internal (vestibular and proprioceptive) signals coming from their own body, rather than on exteroceptive, visual cues.
Past actions, and events that transpire right before an encounter or any other form of stimulation, strongly influence how sensory stimuli are processed and perceived. On a basic level, the information our senses receive is often ambiguous and incomplete. However, it is grouped together so that we can understand the physical world around us.
It is these various forms of stimulation, combined with our previous knowledge and experience, that allow us to create our overall perception. For example, when engaging in conversation, we attempt to understand another person's message and words not only by paying attention to what we hear, but also by drawing on the mouth shapes we have previously seen speakers make.
Similarly, if a familiar topic comes up in another conversation, we use our previous knowledge to guess the direction in which the conversation is headed.
Effect of motivation and expectation:
Main article: Set (psychology)
A perceptual set, also called perceptual expectancy or simply set, is a predisposition to perceive things in a certain way. It is an example of how perception can be shaped by "top-down" processes such as drives and expectations. Perceptual sets occur in all the different senses. They can be long term, such as a special sensitivity to hearing one's own name in a crowded room, or short term, as in the ease with which hungry people notice the smell of food.
A simple demonstration of the effect involved very brief presentations of non-words such as "sael". Subjects who were told to expect words about animals read it as "seal", but others who were expecting boat-related words read it as "sail".
Sets can be created by motivation and so can result in people interpreting ambiguous figures so that they see what they want to see. For instance, how someone perceives what unfolds during a sports game can be biased if they strongly support one of the teams.
In one experiment, students were allocated to pleasant or unpleasant tasks by a computer. They were told that either a number or a letter would flash on the screen to say whether they were going to taste an orange juice drink or an unpleasant-tasting health drink. In fact, an ambiguous figure was flashed on screen, which could either be read as the letter B or the number 13. When the letters were associated with the pleasant task, subjects were more likely to perceive a letter B, and when letters were associated with the unpleasant task they tended to perceive a number 13.
Perceptual set has been demonstrated in many social contexts. When someone has a reputation for being funny, an audience is more likely to find them amusing. Individuals' perceptual sets reflect their own personality traits. For example, people with an aggressive personality are quicker to correctly identify aggressive words or situations.
One classic psychological experiment showed slower reaction times and less accurate answers when a deck of playing cards reversed the color of the suit symbol for some cards (e.g. red spades and black hearts).
Philosopher Andy Clark explains that perception, although it occurs quickly, is not simply a bottom-up process (where minute details are put together to form larger wholes). Instead, our brains use what he calls predictive coding. It starts with very broad constraints and expectations for the state of the world, and as expectations are met, it makes more detailed predictions (errors lead to new predictions, or learning processes).
Clark says this research has various implications; not only can there be no completely "unbiased, unfiltered" perception, but this means that there is a great deal of feedback between perception and expectation (perceptual experiences often shape our beliefs, but those perceptions were based on existing beliefs). Indeed, predictive coding provides an account where this type of feedback assists in stabilizing our inference-making process about the physical world, such as with perceptual constancy examples.
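A minimal sketch of the predictive-coding idea Clark describes (an illustration of the error-correction loop, not his model): the system starts from a broad expectation, compares its prediction with the sensory input, and uses the prediction error to refine the prediction:

    sensory_input = 7.0      # the actual state of the world
    prediction = 0.0         # initial broad expectation
    learning_rate = 0.5
    for step in range(10):
        error = sensory_input - prediction   # prediction error
        prediction += learning_rate * error  # update toward the input
    print(round(prediction, 3))              # converges near 7.0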
Click on any of the following blue hyperlinks for more about "Perceptions"
- Action-specific perception
- Alice in Wonderland syndrome
- Apophenia
- Binding Problem
- Change blindness
- Experience model
- Feeling
- Generic views
- Ideasthesia
- Introspection
- Model-dependent realism
- Multisensory integration
- Near sets
- Neural correlates of consciousness
- Pareidolia
- Perceptual paradox
- Philosophy of perception
- Proprioception
- Qualia
- Recept
- Samjñā, the Buddhist concept of perception
- Simulated reality
- Simulation
- Transsaccadic memory
- Visual routine
- Theories of Perception: several different perspectives on perception
- Richard L. Gregory: theories of Richard L. Gregory
- Optical illusions: a comprehensive set of optical illusions, presented by Michael Bach
- Optical Illusions: examples of well-known optical illusions
- The Epistemology of Perception: article in the Internet Encyclopedia of Philosophy
- Cognitive Penetrability of Perception and Epistemic Justification: article in the Internet Encyclopedia of Philosophy