You may need to install Flash Player to download the document.
—
Published by Jorge Barbosa on SÍSIFO at 7/26/2009 01:30:00 AM
Politicians' ideas may be boring, unjust, wrong, nonsensical, and so on, but it must be said that they are not pollutants. In fact, they are biodegradable.
Mindful of this biodegradable character of their ideas, politicians used to write their autobiographies so that the ideas would be preserved. Although some autobiographies are truly polluting and in very poor taste, it is true that others brought a certain perfume to things. Everything seemed to balance out. The perfume of some cancelled the bad smell of others, and everything stayed the same.
But nowadays politicians do not write autobiographies; today's politicians auto-biodegrade. We could welcome the idea, but we should not. Indeed, after auto-biodegrading, politicians wait patiently for some recycling venture to retrieve them from the rubbish bin into which they have put themselves.
So we have (among others):
- a mayor and prime minister, twice auto-biodegraded, recently recycled and returning to political activity, as if he had not already smelled bad quite enough;
- a former minister of education, completely auto-biodegraded, who ends up being recycled to run for prime minister.
Who can assure us that the current minister of education, now auto-biodegrading at remarkable speed, will not also be recycled to run for prime minister in the near future?
And the others, also auto-biodegrading at a brisk pace?
That is why, as long as politicians do not recover the courage to write their autobiographies and keep auto-biodegrading instead, whenever I am told to choose one, or a dozen, or more, or fewer, I will cast a blank vote.
Let them write their autobiographies and leave us in peace.
—
Published by Jorge Barbosa on SÍSIFO at 7/24/2009 10:24:00 PM
KIMBERLEY BROWNLEE 1
1 School of Social Science, University of Manchester, Oxford Road, Manchester M13 9PL, UK
Correspondence to Kimberley Brownlee, Manchester Centre for Political Theory, School of Social Science, University of Manchester, Oxford Road, Manchester M13 9PL, UK.
Copyright © 2009 Society For Applied Philosophy
ABSTRACT
This article briefly examines Onora O’Neill’s account of the relation between normative principles and practical ethical problems with an eye to suggesting that philosophers of practical ethics have reason to adopt fairly high moral ambitions to be edifying and instructive both as educators and as advisors on public policy debates.
—
Published by Jorge Barbosa on SÍSIFO at 7/24/2009 01:48:00 PM
ONORA O’NEILL 1
1 Department of Philosophy, University of Cambridge, Sidgwick Avenue, Cambridge CB3 9DA, UK
Correspondence to Onora O’Neill, Department of Philosophy, University of Cambridge, Sidgwick Avenue, Cambridge CB3 9DA, UK. oon20@cam.ac.uk
Copyright © 2009 Society For Applied Philosophy
ABSTRACT
Normative argument is supposed to guide ways in which we might change the world, rather than to fit the world as it is. This poses certain difficulties for the notion of applied ethics. Taken literally the phrase ‘applied ethics’ suggests that principles or standards with substantial philosophical justification, in particular ethical and political principles with such justification, are applied to particular cases and guide action. However, the ‘cases’ which applied ethics discusses are themselves indeterminate, and the relation of principles to these ‘cases’ differs from the relation of principles to cases in naturalistic, truth-oriented inquiry. Writing in ‘applied ethics’, I shall argue, does not need elaborate case histories or scenarios, since the testing points for normative principles are other normative principles rather than particular cases. Normative principles and contexts to which they are applicable are indeed needed for any reasoning that is practical, but they are not sufficient. Practical ethics needs principles that can not merely be applied in certain cases or situations, but also enacted in certain ways, and requires an account of practical judgement and of the public policies that support that judgement.
—
Published by Jorge Barbosa on SÍSIFO at 7/24/2009 01:38:00 PM
The benefits of youth
Authors: David F. Bjorklund a; Virginia Periss a; Kayla Causey a
Affiliation:
a Florida Atlantic University, Boca Raton, Florida, USA
DOI: 10.1080/17405620802602334
Publication Frequency: 6 issues per year
Published in: European Journal of Developmental Psychology, Volume 6, Issue 1, January 2009, pages 120-137
Abstract
We examine human psychological development from an evolutionary perspective and propose that some aspects of developmental immaturity have been selected for their adaptive value either for individuals at a specific time in development (ontogenetic adaptations) or as preparation for adulthood (deferred adaptations). We review research and theory on the possible adaptive role of immaturity in human development, focusing on play, neural plasticity, and cognitive limitations that may foster the development of sensory systems, learning/education, and caretaking by adults. We argue that considering the possible adaptive features of immaturity, along with the obvious maladaptive ones, provides a more accurate picture of ontogeny and ways to foster healthy psychological development.
Keywords: Cognitive immaturity; Evolutionary development; Ontogenetic adaptation
As the title suggests, our thesis focuses on the benefits of youth, or more generally, on the adaptive nature of immaturity. We view development as more than a “means-to-an-end” process that produces mature adults: While each stage in development, from infancy through childhood and on to adolescence, serves a unique purpose in preparing children for adulthood, these stages also function to adapt children to their immediate environment, or developmental “niche.” This perspective is explicitly informed by evolutionary theory, and before delving into the “benefits of youth,” we examine some of the assumptions of an evolutionary approach to human psychological development.
Evolutionary Developmental Psychology
From the perspective of evolutionary developmental psychology (e.g., Bjorklund & Hernández-Blasi, 2005; Bjorklund & Pellegrini, 2000, 2002; Burgess & McDonald, 2005; Ellis & Bjorklund, 2005; Geary & Bjorklund, 2000), natural selection operates at all stages of development, not just adulthood. Moreover, natural selection has not acted equally on all stages. To pass on their genes, individuals must survive birth, develop to sexual maturity, find a mate with which to reproduce, and successfully raise offspring to reproductive age. Therefore, selection will have its greatest effects on the early stages of ontogeny, and any benefits that encourage development through these stages will be favored, even if they are harmful later in life. For example, the very sex hormones that promote reproduction in Pacific salmon lead to their rapid post-reproductive death. In fact, if a salmon’s gonads are removed it can no longer reproduce, of course, but lives a much longer life (Robertson, 1961). In humans, there is some evidence that testosterone provides early benefits for males associated with social status and dominance, yet may suppress the immune system and increase risks of cancer and heart disease later in life (see Geary, 1998). Organisms are also at greater risk of death early in life, lacking the skills necessary for independent living. This is especially true for animals with extended developmental periods such as Homo sapiens and other nonhuman primates. Accordingly, mechanisms of evolutionary change focus on the developing organism more so than on the adult.
Humans’ extended youth
One characteristic of Homo sapiens is its prolonged period of immaturity. The closer a species’ common ancestor is with Homo sapiens, the longer its period of immaturity: approximately 2 years in lemurs, 4 years in macaques, and 8 years in chimpanzees, compared to approximately 15 years in humans (Poirier & Smith, 1974). According to anthropologist Barry Bogin (1999, 2003), humans’ extended period of immaturity was afforded, in part, by the invention of two new life stages: childhood, approximating the 3 to 7 years between infancy and the juvenile period, and adolescence, a brief 2 to 3 years following menarche, each characterized by distinct physical and psychological characteristics.
All mammal infants are dependent on their parents, especially their mothers, for care, but childhood, in particular, extends the period of high parental investment required to survive. Children cannot acquire or prepare food themselves (they still have their primary, or “baby teeth,” and many foods have to be prepared specially for them), and their physical skills are greatly limited. Moreover, although children display representational (symbolic) thought at this time, they are still largely egocentric (have difficulty considering a perspective other than their own) and demonstrate illogical thinking, which Piaget (1983) described as preoperational (i.e., lacking the logical operations characteristic of children in the juvenile, or concrete operational, period).
This extended period of dependency and delayed sexual maturation can be very costly. Caring for dependent children limits the number of offspring a woman can have, and being dependent on others for many years has its own drawbacks, including susceptibility to predation, starvation, and disease, to name a few. Moreover, the longer one delays sexual maturity, the greater the chance that one will die before reproducing. In biology, when there are great costs to a feature there must also be great benefits, otherwise the feature would have been eliminated by natural selection. The benefits of high quality, lengthy investments in slow-developing offspring must have been substantial enough to outweigh the costs. Otherwise a prolonged period of ontogeny would have been selected out of our species’ history (e.g., Bjorklund & Bering, 2003; Bjorklund & Pellegrini, 2000, 2002).
Adaptations of infancy and childhood
Understanding the evolutionary function, or benefits, of prolonged development can help us know how and why our psychology develops the way that it does. One proposed function is that this extended period of immaturity is associated with a high degree of plasticity and allows children the time and flexibility necessary to master complex skills, such as those needed for tool use, hunting/foraging, and social learning/cognition (e.g., Bjorklund, Cormier, & Rosenberg, 2005). But, we argue, it is not just time that prolonged youth affords, but an approach toward life that is unique to childhood and that may be crucial in the development of “humanness.”
Those aspects of childhood that serve as preparations for adulthood and were selected over the course of evolution are referred to as deferred adaptations (Bjorklund & Hernández-Blasi, 2005; Hernández-Blasi & Bjorklund, 2003). However, some aspects of infancy and childhood are not specific preparations for adulthood. Rather, they are designed by natural selection to adapt the child to its current environment, not necessarily to a future one, and are referred to as ontogenetic adaptations (Bjorklund, 1997; Oppenheim, 1981). From this perspective, aspects of children’s immature functioning are adaptive in their own right, providing infants and children with immediate (albeit sometimes also deferred) benefits.
This is, admittedly, a counterintuitive view, especially among developmentalists. Immature behavior and thought are typically seen as incomplete versions of those of adults. Immaturity is something that children must overcome on their way to adulthood, where the “real show of humanity emerges” (Thomas, 1993, p. 175). We contend that this latter view is inaccurate and insufficient in describing childhood. “Immature” physical, cognitive, and behavioral features served adaptive roles in human phylogeny and continue to have an adaptive role in human ontogeny, and we can take advantage of such immaturity to foster development (Bjorklund, 1997, 2007a; Bjorklund & Green, 1992).
Our intent is not to praise immaturity: Maturity is still the goal of development and artificial prolongation of immaturity would be deleterious. However, we suggest that there may be some adaptive functions for immaturity that coexist with the maladaptive ones, at least at certain times in development. In the following sections we examine several potential benefits of immaturity, including the opportunity for play, the adaptive value of neural inefficiency, the benefits of thinking you’re better than you are, and the adaptive value of the effects that looking and acting young have on others. For a more extensive discussion of the adaptive value of immaturity see Bjorklund (1997, 2007a).
Benefits of Play
“Play has been man’s most useful preoccupation.” (Frank Caplan, educational toy developer)
“Play is a uniquely adaptive act, not subordinate to some other adaptive act, but with a special function of its own in human experience.” (Johan Huizinga, Homo ludens)
Play is what children do, accounting for between 10 and 40% of children’s time and energy expenditure (see Bjorklund & Pellegrini, 2002; Pellegrini, Horvat, & Huberty, 1998). Play is not limited to humans, but is observed in many other, mainly social, species. For example, juvenile apes, canids (wild dogs), dolphins, and elephants spend large proportions of their time and energy engaging in some forms of play (e.g., see Bekoff, 1997; McCowan, Marino, Vance, Walke, & Reiss, 2000). Play is spontaneous and voluntary, while at the same time engaging and effortful. It might appear “purposeless” or as an imperfect version of adult behavior, but because of its high costs researchers have assumed that the outcomes of play must be beneficial, offering some immediate as well as deferred benefits to the developing organism (see Bjorklund & Pellegrini, 2002; Pellegrini, in press; Pellegrini & Smith, 2005). As Susanna Millar (1969) stated, “If animals play, this is because play is useful in the struggle for survival; because play practices and so perfects the skills needed in adult life.”
The most common type of play in early life is locomotor play, which includes vigorous movements (e.g., chase games) and rough-and-tumble play (e.g., play fighting). These types of play are often social. Researchers have proposed that such play serves to help young animals learn about their physical and social environments (e.g., Pellegrini & Smith, 1998).
Object play involves the manipulation of objects in the environment. Such “play with objects” may foster tool use and construction (Pellegrini, in press; Pellegrini & Bjorklund, 2004). For example, children who have the opportunity to play with objects are later more likely to successfully use those objects as tools in a toy-retrieval task (e.g., Cheyne & Rubin, 1983; Smith & Dutton, 1979). Related research revealed that young children who engage in more object-oriented play are more successful on a simple tool-use task than children who engage in less such play (Gredlein & Bjorklund, 2005). Peter Smith (2005) argued that play may help prepare children to explore and develop tool-use skills in ways not possible via social learning. When children play, they do so flexibly, engaging in novel behaviors and processes of discovery and creative problem solving. If these new behaviors prove beneficial they may spread throughout the group and eventually become canalized within the species. In this way, play serves an important role in driving changes not only in ontogeny, but also in phylogeny.
Symbolic, or fantasy, play is likely unique to humans (but see Gómez & Martín-Andrade, 2005) and involves pretending. Symbolic play is first seen in children around 18 months and Piaget argued that it was an expression of the symbolic, or semiotic, function. Symbolic play becomes more complex with age, with children taking turns playing roles (sociodramatic play). Such play is usually social, often includes other types of play (e.g., object play, rough-and-tumble play), and serves to help children understand adult roles in their culture, as well as fostering their social cognition (Smith, 2005). Fantasy play increases over the preschool and early school years and then decreases as games—play with rules—increase, which fosters both cooperation and competition among peers.
One thing that distinguishes Homo sapiens from all other animals is that curiosity and play are extended into adulthood. Only humans play throughout life, showing a need for novelty that results in continued learning and behavioral flexibility. This orientation toward play led the historian Johan Huizinga (1950) to refer to humans as Homo ludens, “playful man.”
While the ideas of the importance of play to human development are neither new nor surprising, some of the other proposed benefits of immaturity may be more counterintuitive. For example, human newborns are neurologically immature and stay that way for some months. This is a necessary result of having to be born too early. If neonates’ brains were more mature, the skull that holds them could not fit through the birth canal of a bipedal woman. This means that much brain development (e.g., synaptogenesis, neuron growth, myelination) occurs postnatally in humans, more so than in any other primate. As a result, the human infant’s nervous system is quite immature compared to that of other primates. How could such brain immaturity and neural inefficiency be advantageous?
First, humans’ prolonged brain growth paired with an extended period of immaturity allows humans to master the skills necessary to live in highly complex and variable societies (e.g., Bjorklund & Bering, 2003; Kaplan, Hill, Lancaster, & Hurtado, 2000). In fact, humans rely more heavily on learning and behavioral flexibility for survival than any other species (Bjorklund & Pellegrini, 2002; Geary, 2005). Humans’ enhanced degree of cognitive and behavioral flexibility is made possible by a brain that maintains a high degree of plasticity long after this is reduced in most other animals (e.g., Jacobson, 1969).
Second, the increased degree of neural (and thus cognitive and behavioral) plasticity permits children to recover from the effects of a deleterious early environment. Through at least the middle of the twentieth century, it was believed that if children suffered severe deprivation for much more than their first year of life, they were destined to a life of mental retardation and psychopathology. Both human and animal work has clearly shown that this is not true. For example, classic research by Harlow, Dodsworth, and Harlow (1965) demonstrated the adverse effects of early social deprivation in rhesus monkeys. Monkeys that were isolated early in life from social contact grew up to be socially and behaviorally aberrant, and the effect appeared to be permanent. However, Suomi and Harlow (1972) showed that the effects of early social isolation could be reversed if the isolated monkeys were given daily social contact with a younger, socially inexperienced peer.
This “rebound” effect has also been well documented in human children living in stultifying institutions. Children reared from birth or shortly thereafter in overcrowded, understaffed orphanages show signs of intellectual and social/emotional retardation, with early studies indicating that such negative effects were permanent (Dennis, 1973; Spitz, 1945). Although there was some evidence that the effects of deprivation could be reversed if children were removed from such institutions (Skeels, 1966), more recent work examining institutionalized Romanian children demonstrates that the likelihood of altering patterns of intellectual retardation is related to the age at which children are removed from the institution and placed in more stimulating environments (e.g., Beckett et al., 2006; Nelson et al., 2007; O’Connor et al., 2000). Upon first leaving the institutions, the children displayed physical, social, and intellectual deficits. However, children placed in foster or adoptive homes before the age of 6 months attained IQ scores by age 6 years comparable to those of adopted children who had never been institutionalized. Yet, IQ scores at ages 6 and 11 were significantly lower for those children who remained in the institution longer, particularly those who were adopted after 24 months. However, Beckett et al. (2006) reported a slight increase in the IQs of these late-adopted children at age 11, suggesting a catch-up effect for the children who experienced the longest deprivation. Thus, this “rebound” effect was greater the earlier in life the positive change occurred, due to the progressive loss of neural plasticity with age.
Third, inefficient neural processing early in life may protect children from overstimulation. For example, Turkewitz and Kenny (1982) argued that perceptual limitations in infancy are adaptive in that they allow one sensory system to develop without having to compete for neural resources with another, usually later-developing, system. From this perspective, immature sensory systems are not handicaps that must be overcome, but are adaptive and necessary for proper sensory development and sensory learning. The different sensory systems develop in an invariant order for all vertebrates, with audition, for instance, developing before vision (Gottlieb, 1971). Perceptual experiences are correlated with neural development, so that most birds and mammals receive auditory stimulation prior to visual stimulation. This means that early developing senses (e.g., audition) do not have to “compete” for neurons with later developing senses (e.g., vision). When animals receive species-atypical patterns of stimulation, however, it interferes with this choreographed dance between gene-influenced neural maturation and perceptual experience. For example, when the tops of the eggshells of bobwhite quails were removed so that they could perceive patterned light, they showed enhanced levels of visual discrimination abilities upon hatching, but suffered impairments to their auditory attachment behavior (they failed to approach the maternal call of their own species, but rather made no choice or approached the call of a chicken; Lickliter, 1990). Numerous other studies with precocial birds (e.g., Gottlieb, Tomlinson, & Radell, 1989; McBride & Lickliter, 1993) and rats (e.g., Kenny & Turkewitz, 1986; Spear, 1984) have demonstrated the adverse effects that result from decoupling the timing of perceptual experience from neural maturation.
Similar effects are also seen in premature human infants, who are exposed to a host of tactile, auditory, and visual stimuli months before the brain “expects” them (e.g., Als, 1995; Lickliter, 2000). For instance, Als (1995) commented that although premature infants frequently show cognitive or perceptual impairments, these are often accompanied by areas of exceptional ability, in mathematics, for example.
The evidence just reviewed indicates that because of the immaturity of the nervous system, sensory experience in excess of the species’ norm can have detrimental effects on the development of the perceptual systems. There is also limited evidence that formal learning experiences early in life may have unintended consequences. For example, Harlow (1959) investigated the effects of early training on rhesus monkeys. Monkeys were given a discrimination-learning task beginning at different ages ranging from 60 to 366 days. The learning task consisted of teaching the monkeys to choose among stimuli that varied on several dimensions (e.g., size, shape, color). After 120 days a more complicated discrimination task was given. Results revealed that the younger the age of the monkey when training began, the worse the monkey performed on later learning tasks. Harlow (1959, p. 472) concluded that “there is a tendency to think of learning or training as intrinsically good and necessarily valuable to the organism, however that training can either be helpful or harmful, depending upon the nature of the training and the organism’s stage of development.”
As you might imagine, there has been a paucity of research with human infants to evaluate this hypothesis. In one study, Papousek (1977) conditioned infants to turn their heads to the sound of a bell. Infants began training at birth, at 31 days, or at 41 days of age. Results were similar to those found by Harlow with monkeys. Infants who began training from birth required more days to learn the task than did infants who began training at an older age. A more recent study examined the effect on language development of educational DVDs and videos designed for infants (Zimmerman, Christakis, & Meltzoff, 2007). Within the last decade or so, visual “media” for infants, including DVDs such as Baby Einstein® and Brainy Baby®, have been developed to enhance cognitive development. Zimmerman and his colleagues (2007) looked at the relationship between DVD/video viewing and receptive language in 8- to 16-month-old infants. They reported that every hour of baby DVD/video that infants watched corresponded to about 6 to 8 fewer words in their receptive vocabularies. Other researchers observed 12-, 24-, and 36-month-old children play both with and without television programs broadcasting in the background. Findings revealed that the quality of children’s play and their focused attention was negatively affected by having the TV on, even if they paid little attention to it (Schmidt, Pempek, Kirkorian, Lund, & Anderson, 2008).
Related to this, there has been substantial controversy about the best way to educate preschool children. It is clear that quality preschool education facilitates children’s intellectual development (e.g., Melhuish et al., 2008), and that these programs are especially beneficial for children from less-advantaged homes (e.g., Karoly, Kilburn, & Cannon, 2005). Debate has centered on the degree to which preschool programs should emphasize children’s “natural” propensities for play (termed developmentally appropriate programs) versus those that stress formal instruction (termed direct-instructional programs; see Bjorklund, 2007b). There seem to be few consistent differences in academic achievement between these types of programs following one year of instruction. However, when longer-term (greater than one year) effects are assessed, differences in academic attainment are typically greater for children in the developmentally appropriate rather than the direct-instructional programs (e.g., Burts et al., 1993; Marcon, 1999). Differences between these types of programs are clearer when motivational and psychosocial factors are considered, and most studies find that children attending developmentally appropriate programs experience less stress, like school better, are more creative, and have less test anxiety than children attending direct-instructional programs (Burts et al., 1993; Stipek, Feiler, Daniels, & Milburn, 1995; Stipek et al., 1998; see Bjorklund, 2007b). In other words, any academic benefits gained from a teacher-directed program came at a cost in terms of motivation.
According to the authors of one study that contrasted developmentally appropriate and direct-instructional programs, “If it has no clear benefit to the child’s development, and if it may hinder development, there may be no defensible reason to encourage the introduction of formal academic instruction and adult-focused learning during the preschool years” (Hyson, Hirsh-Pasek, & Rescorla, 1990).
With increasing age, children become better at predicting and evaluating their performance. Preschool children, in contrast, are notorious for overestimating their abilities. Compared to older children, younger children believe that they have better memories (e.g., Flavell, Friedrichs, & Hoyt, 1970) and greater physical abilities (e.g., Plumert & Schwebel, 1997; Schneider, 1998), believe that they are more skilled at imitating models (Bjorklund, Gaultney, & Green, 1993), that they know more about how things work (Mills & Keil, 2004), and that they are smarter (e.g., Spinath & Spinath, 2005; Stipek, 1981), and they rate themselves as stronger, tougher, and of higher social standing (e.g., Boulton & Smith, 1990; Humphreys & Smith, 1987) than is actually the case.
There has been some interesting speculation as to why young children have these overestimation biases, one strand of which is consistent with Bandura’s (1997) ideas about the importance of developing a positive sense of self-efficacy, or a perspective of seeing oneself as a person in control of one’s life. Young children’s overly optimistic attitudes serve as motivating factors for them to try new tasks that they otherwise might not attempt, and to persist at tasks where a “wiser” or metacognitively more competent child might quit. If young children knew how poorly they really performed these tasks, their sense of self-efficacy could be damaged and they likely would not be so bold in trying new tasks, nor so persistent at tasks they do attempt.
Several studies have examined the potentially adaptive nature of young children’s overestimation of their abilities. In an early study from our lab (Bjorklund et al., 1993), 3-, 4-, and 5-year-old children watched a model perform two tasks with different levels of difficulty: juggling 1, 2, or 3 balls, and throwing balls in a basket from three different distances. The children were asked to make both predictions—how well they thought they would perform each task—and postdictions—how well they thought they had performed each task. They were also administered the Vocabulary subtest of the Wechsler Preschool and Primary Scale of Intelligence, which was correlated with the accuracy of their predictions and postdictions. As anticipated, each group of children overestimated their abilities, both for predictions and postdictions, although older children overestimated less than younger children. Additionally, the relationship between accuracy of estimation and verbal IQ revealed that the 3- and 4-year-old children with higher IQ scores tended to overestimate more, whereas 5-year-olds who overestimated more tended to have lower IQ scores. This latter pattern is similar to that found for older children (more accurate children have higher IQs; see Schneider, 1985), but the pattern for the 3- and 4-year-olds implies that being out of touch with one’s imitative abilities is associated with greater competence (at least higher IQ scores) for younger children.
In a more recent study, we investigated the effect of overestimating one’s recall on subsequent memory performance (Shin, Bjorklund, & Beck, 2007). Kindergarten, first-, and third-grade children were given five sort-recall memory trials using different sets of categorically related words on each trial. Prior to the study phase on each trial, children were asked to predict how many items they would remember. In addition, we assessed strategies used during the study phase (e.g., rehearsal, sorting, self-testing) and during recall (clustering, recalling items according to category membership). Children’s prediction accuracy was calculated by taking their predicted recall minus their actual recall. Children were then classified into either a high- or low-accuracy group based on their prediction accuracy on the first two trials. Changes in recall from the earlier trials (1 and 2) to the later ones (4 and 5) were then assessed as a function of children’s prediction accuracy. The results for levels of recall, clustering, and number of strategies used are shown in Table 1. As predicted, at all grades, children in the low-accuracy group showed greater gains in recall than children in the high-accuracy group. A similar pattern was found for clustering and number of strategies used for the two younger groups of children, although the opposite pattern was observed for the oldest children. Thus, at least for younger children, overestimating one’s abilities on early trials is associated with greater gains in cognitive performance than for children who are more in touch with their cognitive abilities. These findings are consistent with Bandura’s arguments that children’s overestimation biases foster improvements in their abilities by motivating persistence and promoting self-efficacy.
Table 1

| | Kindergarten | Grade 1 | Grade 3 |
|---|---|---|---|
| Recall | Low > High | Low > High | Low > High |
| Clustering | Low = High | Low > High | High > Low |
| # Strategies | Low > High | Low > High | High > Low |

Notes: Entries in bold indicate a statistically significant difference. (Adapted from Shin, Bjorklund, & Beck, 2007.)
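The scoring procedure described above can be sketched in code. This is a minimal illustration with hypothetical numbers, not the authors’ analysis scripts: prediction error is predicted recall minus actual recall, children are split into high- versus low-accuracy groups from trials 1–2, and recall gain is the change from trials 1–2 to trials 4–5. The function names and the threshold value are assumptions for illustration only.

```python
def prediction_error(predicted, actual):
    """Signed overestimation: predicted recall minus actual recall."""
    return predicted - actual

def classify(errors_early_trials, threshold):
    """Label a child high- or low-accuracy from mean error on trials 1-2.

    A mean absolute error above the (hypothetical) threshold counts as
    low prediction accuracy, i.e., substantial overestimation.
    """
    mean_err = sum(errors_early_trials) / len(errors_early_trials)
    return "low" if abs(mean_err) > threshold else "high"

def recall_gain(recall_by_trial):
    """Change in recall from trials 1-2 to trials 4-5 (five trials total)."""
    early = (recall_by_trial[0] + recall_by_trial[1]) / 2
    late = (recall_by_trial[3] + recall_by_trial[4]) / 2
    return late - early

# Hypothetical child: predicts 8 items but recalls only 3-4 on early trials,
# so is classified as low accuracy; recall then improves across trials.
errs = [prediction_error(8, 3), prediction_error(8, 4)]
group = classify(errs, threshold=2)       # -> "low"
gain = recall_gain([3, 4, 5, 6, 7])       # -> 3.0 items gained
```

The design point the sketch makes concrete is that accuracy grouping is fixed on the early trials, so later gains can be compared between groups without circularity.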
In addition to the direct benefits to children afforded by immaturity, aspects of immaturity may have evolved to have indirect effects for children by promoting caregiving from adults. This was famously argued by Konrad Lorenz (1943), who proposed that adults’ feelings of nurturance and protection may be provoked by certain universal infantile characteristics. For example, adults rate pictures of large, round, infant-like heads as more “cute” relative to images of adult-like head shapes (e.g., Alley, 1981; Fullard & Reiling, 1976). Likewise, we propose that adults may find young children’s immature psychological characteristics endearing. The selection of a psychological mechanism that processes immature features of infants and young children as preferential and endearing would have aided in the survival of offspring and supported selection of the extended juvenile period as a developmental niche.
Evidence for the adaptive value of cognitive immaturity comes from recent studies from our lab (Bjorklund et al., 2008), which looked at adult perceptions of cognitive immaturity. In a series of three studies, we asked adults to evaluate a series of vignettes containing hypothetical children expressing either immature or mature cognition. Immature cognition was divided into two general types: agentive and nonagentive. In agentive cognition, children express a purposive explanation for some behavior or phenomenon, in that explanations center on some form of intentionality (e.g., Tomasello, 1999).1 For example, 4- and 5-year-old children will say that mountains are “for climbing,” clouds are “for raining,” pointy rocks are “so animals could scratch on them when they get itchy.” In contrast, immature nonagentive cognition does not make inferences about intentionality, but reflects, essentially, poor information processing.
Although the immature statements we used were clearly characteristic of young children, agentive cognition persists into adulthood. For example, older children, adolescents, and adults continue to express animistic beliefs (e.g., “the sun is alive because it gives off heat”; see Looft & Bartz, 1969), adults in all cultures believe in supernatural beings (see Woolley, 1997), and non-schooled adults attribute intention to natural phenomena much as young children do (Casler & Kelemen, 2008). Because of this, we thought that adults might judge children expressing such immature cognition positively, relative to children expressing more mature (i.e., scientific) cognition. In contrast, we hypothesized that adults would likely express no positive bias toward immature forms of nonagentive cognition (e.g., failure to inhibit, overestimating one’s ability).
Adults read two vignettes, one of a child expressing immature cognition (e.g., “The sun’s not out today because it’s mad”) and the other expressing mature cognition (e.g., “The sun’s not out because clouds are blocking it”). After reading each pair of vignettes, participants read a series of adjectives or brief statements and were asked to select which child in the two vignettes best reflected each adjective or statement. The adjectives were later grouped into three dimensions: positive affect (cute, endearing, friendly, nice, feel protective of); negative affect (likely to lie, sneaky, feel aggravated with, feel impatient with); and intelligence (smart, intelligent).
The major pattern of results was similar for the three studies, the first with American college students, the second with American college students and parents of young children, and the third with Spanish college students. Figure 1 presents the mean scores for the agentive and nonagentive vignettes for each dimension, averaged across the three studies, with higher scores (toward 1.0) reflecting a greater number of immature children chosen. Although participants consistently chose the mature children as more intelligent for both the agentive and nonagentive vignettes, they chose the immature child significantly more often for positive affect when expressing agentive cognition (“The sun’s not out because it’s mad”). In contrast, children expressing immature nonagentive cognition were more apt to be selected for the negative-affect adjectives. Thus, it seems that some forms of immature cognition, similar to the infantile facial features identified by Lorenz, endear children to adults and protect them from negative feelings.
Developmental psychologists have spent much time examining how children’s social, emotional, and cognitive abilities approach those of the mature adult, and this is a wholly reasonable approach to the study of ontogeny. However, this adult-centric view produces a bias toward seeing early, immature forms and functions as handicaps that must be overcome on the way to maturity. Taking an evolutionary perspective of development causes one to see immaturity a bit differently. Evolution can be seen as a succession of ontogenies—our ancestors evolved, but each also developed, and natural selection must have operated as potently, or more so, on the early stages of ontogeny as on the adult. Humans’ particular life history, with the long journey to sexual maturity and the possible invention of “new” stages (Bogin, 2003), likely made adaptations during the pre-reproductive years especially critical. When viewed from this perspective, many of the physical and psychological features of young humans can be seen in a different light—not as handicaps, but as benefits, enhancing children’s adjustment to their immediate environments or supporting acquisitions that will facilitate adult functioning.
As we mentioned, an extended period of immaturity has its drawbacks. Immaturity is often maladaptive, and the ultimate goal of development is to grow into physically, cognitively, and socially mature adults. Our purpose here is not to romanticize immaturity, to argue for its prolongation, or to propose that the traditional approach of developmental psychologists, looking to the adult as the “finished product,” is misguided. Rather, our intent is to argue that some aspects of immaturity have adaptive value, either immediate or deferred, and that recognizing this can change how we view ontogeny and help us foster healthy psychological development in children.
1This is reflected in Piagetian concepts such as finalism, in which children believe that natural objects and events must have a specifiable cause, failing to understand that some events happen entirely by chance. It is also seen in animism, in which children attribute living characteristics (and thus intentionality) to inanimate objects.