
Tuesday, November 24, 2009

Interview with Keith Stanovich


By Coert Visser

Dr. Keith Stanovich, Professor of Human Development and Applied Psychology at the University of Toronto, is a leading expert on the psychology of reading and on rationality. His latest book, What Intelligence Tests Miss: The Psychology of Rational Thought, shows that IQ tests are very incomplete measures of cognitive functioning. These tests fail to assess rational thinking styles and skills that are nevertheless crucial to real-world behavior. In this interview he explains the difference between IQ and rationality and why rationality is so important. He also shares his views on how rationality can be enhanced.

In your book, you say that IQ tests are incomplete measures of cognitive functioning. Could you explain that?

I start out my book by noting the irony that in 2002, cognitive scientist Daniel Kahneman of Princeton University won the Nobel Prize in Economics for work on how humans make choices and assess probabilities—in short, for work on human rationality.  Being rational means adopting appropriate goals, taking the appropriate action given one’s goals and beliefs, and holding beliefs that are commensurate with available evidence—it means achieving one’s life goals using the best means possible.  To violate the thinking rules examined by Kahneman and Tversky thus has the practical consequence that we are less satisfied with our lives than we might be.  Research conducted in my own laboratory has indicated that there are systematic individual differences in the judgment and decision making skills studied by Kahneman and Tversky.

It is a profound historical irony of the behavioral sciences that the Nobel Prize was awarded for studies of cognitive characteristics that are entirely missing from the most well-known mental assessment device in the behavioral sciences—the intelligence test, and its many proxies, such as the SAT.  It is ironic because most laypeople are prone to think that IQ tests are tests of, to put it colloquially, good thinking.  Scientists and laypeople alike would tend to agree that “good thinking” encompasses good judgment and decision making—the type of thinking that helps us achieve our goals.  In fact, the type of “good thinking” that Kahneman and Tversky studied was deemed so important that research on it was awarded the Nobel Prize.  Yet assessments of such good thinking—rational thinking—are nowhere to be found on IQ tests.  Intelligence tests measure important things, but not these—they do not assess the extent of rational thought.  This might not be such an omission if it were the case that intelligence was a strong predictor of rational thinking.  However, my research group has found just the opposite—that it is a mild predictor at best and that some rational thinking skills are totally dissociated from intelligence.

You write about three types of thinking processes, the autonomous, the algorithmic and the reflective mind. Could you briefly explain these and explain how they are related to intelligence and rationality?

In 1996, philosopher Daniel Dennett wrote a book about how aspects of the human mind were like the minds of other animals and how other aspects were not. He titled the book Kinds of Minds to suggest that within the brain of humans are control systems of very different types—different kinds of minds. In the spirit of Dennett’s book, I termed the part of the mind that carries out Type 1 processing the autonomous mind. The difference between the algorithmic mind and the reflective mind is captured in another well-established distinction in the measurement of individual differences—the distinction between cognitive ability and thinking dispositions. The algorithmic mind is indexed by measures of computational power like fluid g in psychometric theory. The reflective mind is indexed by individual differences in thinking disposition measures.

The term mindware was coined by psychologist David Perkins to refer to the rules, knowledge, procedures, and strategies that a person can retrieve from memory in order to aid decision making and problem solving. Perkins uses the term to stress the analogy to software in the brain/computer analogy.  Each of the levels in the tripartite model of mind has to access knowledge to carry out its operations.  The reflective mind not only accesses general knowledge structures but, importantly, accesses the person’s opinions, beliefs, and reflectively acquired goal structure.  The algorithmic mind accesses micro-strategies for cognitive operations and production system rules for sequencing behaviors and thoughts. Finally, the autonomous mind accesses not only evolutionarily-compiled encapsulated knowledge bases, but also retrieves information that has become tightly compiled and available to the autonomous mind due to overlearning and practice.
Rationality requires three different classes of mental characteristic. First, algorithmic-level cognitive capacity is needed in order that autonomous-system override and simulation activities can be sustained.  Second, the reflective mind must be characterized by the tendency to initiate the override of suboptimal responses generated by the autonomous mind and to initiate simulation activities that will result in a better response.  Finally, the mindware that allows the computation of rational responses needs to be available and accessible during simulation activities. Intelligence tests assess only the first of these three characteristics that determine rational thought and action.  As measures of rational thinking, they are radically incomplete.

That society, educators, psychologists, and personnel managers put so much emphasis on intelligence seems strange and unjustified given that intelligence tests cover only one of these three important mental processes. Could you say something about how individuals, organizations and, perhaps, society as a whole, might benefit from focusing more on raising rational thinking skills?

The lavish attention devoted to intelligence, raising it, praising it, worrying when it is low, etc., seems wasteful in light of the fact that we choose to virtually ignore another set of mental skills with just as much social consequence—rational thinking mindware and procedures.  Popular books tell parents how to raise more intelligent children, educational psychology textbooks discuss the raising of students’ intelligence, and we feel reassured when hearing that a particular disability does not impair intelligence.  There is no corresponding concern on the part of parents that their children grow into rational beings, no corresponding concern on the part of schools that their students reason judiciously, and no corresponding recognition that intelligence is useless to a child unable to adapt to the world.

I simply do not think that society has weighed the consequences of its failure to focus on irrationality as a real social problem.  These skills and dispositions profoundly affect the world in which we live.  Because of inadequately developed rational thinking abilities—because of the processing biases and mindware problems discussed in my book—physicians choose less effective medical treatments; people fail to accurately assess risks in their environment; information is misused in legal proceedings; millions of dollars are spent on unneeded projects by government and private industry; parents fail to vaccinate their children; unnecessary surgery is performed; animals are hunted to extinction; billions of dollars are wasted on quack medical remedies; and costly financial misjudgments are made.  Distorted processes of belief formation are also implicated in various forms of ethnocentric, racist, sexist, and homophobic hatred.

It is thus clear that widespread societal effects result from inadequately developed rational thinking dispositions and knowledge. In the modern world, the impact of localized irrational thoughts and decisions can be propagated and magnified through globalized information technologies, thus affecting large numbers of people. That is, you may be affected by the irrational thinking of others even if you do not take irrational actions yourself. This is why, for example, the spread of pseudoscientific beliefs is everyone’s concern. Police departments hire psychics to help with investigations even though research has shown that their use is not efficacious. Jurors have been caught making their decisions based on astrology. Major banks and several Fortune 500 companies employ graphologists for personnel decisions even though voluminous evidence indicates that graphology is useless for this purpose.

Unfortunately, these examples are not rare. We are all affected in numerous ways when such contaminated mindware permeates society—even if we avoid this contaminated mindware ourselves.   Pseudosciences such as astrology are now large industries, involving newspaper columns, radio shows, book publishing, the Internet, magazine articles, and other means of dissemination.  The House of Representatives Select Committee on Aging has estimated that the amount wasted on medical quackery nationally reaches into the billions.  Physicians are increasingly concerned about the spread of medical quackery on the Internet and its real health costs.

It seems that high rationality can sometimes irritate people. For instance, you can sometimes hear people say things like: "don't be so rational!" Do you think there can be such a thing as being too rational?

Under a proper definition of rationality, one consistent with modern cognitive science, no. It certainly is possible for a person to be “too logical,” but being logical is not synonymous with being rational. Psychologists study rationality because it is one of the most important human values. It is important for a person’s happiness and well-being that they think and act rationally. The high status accorded rationality in my writings may seem at odds with other characterizations that deem rationality either trivial (little more than the ability to solve textbook-type logic problems) or in fact antithetical to human fulfillment (as an impairment to an enjoyable emotional life, for instance). These ideas about rationality derive from a restricted and mistaken view of rational thought—one not in accord with the study of rationality in modern cognitive science.

Dictionary definitions of rationality tend to be rather lame and unspecific (“the state or quality of being in accord with reason”), and some critics who wish to downplay the importance of rationality have promulgated a caricature of rationality that involves restricting its definition to the ability to do the syllogistic reasoning problems that are encountered in Philosophy 101.  The meaning of rationality in modern cognitive science is, in contrast, much more robust and important.  Cognitive scientists recognize two types of rationality:  instrumental and epistemic.  The simplest definition of instrumental rationality, the one that emphasizes most that it is grounded in the practical world, is: Behaving in the world so that you get exactly what you most want, given the resources (physical and mental) available to you.  The other aspect of rationality studied by cognitive scientists is epistemic rationality. This aspect of rationality concerns how well beliefs map onto the actual structure of the world.  The two types of rationality are related. In order to take actions that fulfill our goals, we need to base those actions on beliefs that are properly calibrated to the world.

Although many people feel (mistakenly or not) that they could do without the ability to solve textbook logic problems (which is why the caricatured view of rationality works to undercut its status), virtually no person wishes to eschew epistemic rationality and instrumental rationality, properly defined. Virtually all people want their beliefs to be in some correspondence with reality, and they also want to act to maximize the achievement of their goals. Psychologist Ken Manktelow, in his book Psychology of Reasoning, has emphasized the practicality of both types of rationality by noting that they concern two critical things: what is true and what to do. Epistemic rationality is about what is true and instrumental rationality is about what to do.

Nothing could be more practical or useful for a person’s life than the thinking processes that help them find out what is true and what is best to do. This stands in marked contrast to some restricted views of what rationality is (for example, the rationality = logic view that I mentioned above). Being rational (in the sense studied by cognitive scientists) is NOT just being logical. Instead, logic (and all other cognitive tools) must prove its worth. It must show that it helps us get at what is true or helps us to figure out what it is best to do. My philosophy echoes that of Jonathan Baron, in his book Thinking and Deciding (4th Edition), when he argues that “the best kind of thinking, which we shall call rational thinking, is whatever kind of thinking best helps people achieve their goals. If it should turn out that following the rules of formal logic leads to eternal happiness, then it is rational thinking to follow the laws of logic, assuming that we all want eternal happiness. If it should turn out, on the other hand, that carefully violating the laws of logic at every turn leads to eternal happiness, then it is these violations that we shall call rational” (p. 61).

A similar admonition applies when we think about the relation between emotion and rationality. In folk psychology, emotion is seen as antithetical to rationality. The absence of emotion is seen as purifying thinking into purely rational form. This idea is not consistent with the definition of rationality that I (and most other cognitive scientists) adopt. Instrumental rationality is behavior consistent with maximizing goal satisfaction, not a particular psychological process. It is perfectly possible for the emotions to facilitate instrumental rationality as well as to impede it. In fact, conceptions of emotions in cognitive science stress the adaptive regulatory powers of the emotions. Emotions often get us “in the right ballpark” of the correct response. If more accuracy than that is required, then a more precise type of analytic cognition will be required. Of course, we can rely too much on the emotions. We can base responses on a “ballpark” solution in situations that really require a more precise type of analytic thought. More often than not, however, processes of emotional regulation facilitate rational thought and action.

Writer Malcolm Gladwell, in his bestselling book Blink, adopts the folk psychological view of the relation between emotion and rationality that is at odds with the way those concepts are discussed in cognitive science. Gladwell discusses the famous cases of cognitive neuroscientist Antonio Damasio where damage to the ventromedial prefrontal cortex caused nonfunctional behavior without impairing intelligence. Gladwell argues that “people with damage to their ventromedial area are perfectly rational. They can be highly intelligent and functional, but they lack judgment” (2005, p. 59). But this is not the right way to describe these cases. In my view, someone who lacks judgment cannot be rational.

In the book, you explain that a lack of rationality is associated with three things: 1) an overreliance on the autonomous mind, relying on unconscious heuristics where deliberate thinking would have been called for; 2) a mindware gap, a lack of rational tools, procedures, knowledge, and strategies; and 3) being infected with contaminated mindware, which refers to beliefs, rules, strategies, etc. that are not grounded in evidence, are potentially harmful, and yet are hard to get rid of, like a computer virus. Now, I can imagine that bridging the mindware gap can be accomplished largely by education. The other two seem a bit harder to me. Could you share some ideas about what might help to prevent an overreliance on the autonomous mind and about how to fight contaminated mindware?

You are correct that irrationality caused by mindware gaps is most easily remediable, as it is entirely due to missing strategies and declarative knowledge that can be taught (your category #2 above). But keep in mind that category #1 (overriding the tendencies of the autonomous mind) is often closely linked, because override is most often done with learned mindware, and sometimes override fails because of inadequately instantiated mindware. In such a case, inadequately learned mindware should really be considered the source of the problem (the line between the two is continuous—as the rule is less and less well instantiated, at some point it is so poorly compiled that it is not a candidate to override the Type 1 response, and thus the processing error becomes a mindware gap).

Other categories of cognitive failure are harder to classify in terms of whether they are more dispositional (category #1) or knowledge-like (category #2).  For example, disjunctive reasoning is the tendency to consider all possible states of the world when deciding among options or when choosing a problem solution in a reasoning task.  It is a rational thinking strategy with a high degree of generality.  People make many suboptimal decisions because of the failure to flesh out all the possible options in a situation, yet the disjunctive mental tendency is not computationally expensive.  This is consistent with the finding that there are not strong intelligence-related limitations on the ability to think disjunctively and with evidence indicating that disjunctive reasoning is a rational thinking strategy that can be taught.

The tendency to consider alternative hypotheses is, like disjunctive reasoning, strategic mindware of great generality. Also, it can be implemented in very simple ways. Many studies have attempted to teach the technical issue of thinking of P(D|~H) [the probability of the observed data given the alternative hypothesis], or thinking of the alternative hypothesis, by instructing people in a simple habit. People are given extensive practice at saying to themselves the phrase “think of the opposite” in relevant situations. This strategic mindware does not stress computational capacity and thus is probably easily learnable by many individuals. Several studies have shown that practice at the simple strategy of triggering the thought “think of the opposite” can help to prevent a host of the thinking errors studied in the heuristics and biases literature, including but not limited to: anchoring biases, overconfidence effects, hindsight bias, confirmation bias, and self-serving biases.
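
As a rough illustration (my own, not one from the interview) of why P(D|~H) matters, consider a hypothetical diagnostic cue that occurs in 90% of cases where a hypothesis H is true, but also in 20% of cases where it is false, with a base rate of 10%. Bayes' rule gives

\[
P(H \mid D) = \frac{P(D \mid H)\,P(H)}{P(D \mid H)\,P(H) + P(D \mid \neg H)\,P(\neg H)}
= \frac{0.9 \times 0.1}{0.9 \times 0.1 + 0.2 \times 0.9}
= \frac{0.09}{0.27} \approx 0.33.
\]

Someone who never asks “how likely is this evidence if my hypothesis is wrong?” effectively drops the P(D|~H) term and concludes the hypothesis is almost certainly true; that is precisely the error the “think of the opposite” habit is meant to catch.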

Various aspects of probabilistic thinking represent mindware of great generality and potency. However, as any person who has ever taught a statistics course can attest (your present author included), some of these insights are counterintuitive and unnatural for people—particularly in their application. There is nevertheless still some evidence that they are indeed teachable—albeit with somewhat more effort and difficulty than strategies such as disjunctive reasoning or considering alternative hypotheses. Aspects of scientific thinking necessary to infer a causal relationship are also definitely teachable. Other strategies of great generality may be easier to learn—particularly by those of lower intelligence. For example, psychologist Peter Gollwitzer has discussed an action strategy of extremely wide generality—the use of implementation intentions. An implementation intention is formed when the individual marks the cue-action sequence with the conscious, verbal declaration: “when X occurs, I will do Y”. Finally, research has shown that an even more minimalist cognitive strategy of forming mental goals (whether or not they have implementation intentions) can be efficacious. For example, people perform better in a task when they are told to form a mental goal (“set a specific, challenging goal for yourself”) for their performance as opposed to being given the generic motivational instructions (“do your best”).

We often make choices that reduce our happiness because we find it hard to predict what will make us happy. For example, people often underestimate how quickly they will adapt to both fortunate and unfortunate events. Our imaginations fail at projecting the future. Psychologist Dan Gilbert cites evidence indicating that a remediating strategy in such situations might be to use a surrogate—someone who is presently undergoing the event whose happiness (or unhappiness) you are trying to simulate. For example, if you are wondering how you will react to “empty nest” syndrome, ask someone who has just had their last child leave for college rather than trying to imagine yourself in that situation. If you want to know how you will feel if your team is knocked out in the first round of the tournament, ask someone whose team has just been knocked out rather than trying to imagine it yourself. People tend not to want to use this mechanism because they think that their own uniqueness makes their guesses from introspection more accurate than the actual experiences of the people undergoing the event. People are simply skeptical about whether other people’s experiences apply to them. This is a form of egocentrism akin to myside processing. Gilbert captures the irony of people’s reluctance to adopt the surrogate strategy by telling his readers: “If you are like most people, then like most people, you don’t know you’re like most people” (2006, p. 229).

Much of the strategic mindware discussed so far represents learnable strategies in the domain of instrumental rationality (achieving one’s goals). Epistemic rationality (having beliefs well calibrated to the world) is often disrupted by contaminated mindware. However, even here there are teachable macro-strategies that can reduce the probability of acquiring mindware that is harmful to its host. For example, the principle of falsifiability provides a wonderful inoculation against many kinds of nonfunctional beliefs. It is a tool of immense generality. It is taught in low-level methodology and philosophy of science courses, but it could be taught much more broadly than this.

Many pseudoscientific beliefs represent the presence of contaminated mindware. The critical thinking skills that help individuals to recognize pseudoscientific belief systems can be taught in high-school courses. Finally, I think that the language of memetic science itself is therapeutic—a learnable mental tool that can help us become more conscious of the possibility that we are hosting contaminated mindware. One way that the meme concept will aid in cognitive self-improvement is that, by emphasizing the epidemiology of belief, it will indirectly suggest to many (for whom it will be a new insight) the contingency of belief. By providing a common term for all cultural units, memetic science provides a neutral context for evaluating whether any belief serves our interests as humans. The very concept of the meme will suggest to more and more people that they need to engage in mindware examination.

I recently heard someone say: "I'm just a simple man doing a simple job. What's the harm in me being not so rational?" This made me wonder, is there anything known about what characteristics of a task, role or context determine the criticality of rationality? How can we know when rationality is critical and when it is a bit less important or even completely unimportant?

Your question relates to an issue I have written about in my book The Robot’s Rebellion. The simple man with the simple job might be protected from his irrationality by living in a rational culture, in which he is, in effect, a cultural freeloader. Cultural diffusion that allows knowledge to be shared short-circuits the need for separate individual discovery. In fact, most of us are cultural freeloaders--adding nothing to the collective knowledge or rationality of humanity. Instead, we benefit every day from the knowledge and rational strategies invented by others.

The development of probability theory, concepts of empiricism, mathematics, scientific inference, and logic throughout the centuries has provided humans with conceptual tools to aid in the formation and revision of belief and in their reasoning about action. A college sophomore with introductory statistics under his or her belt could, if time-transported to the Europe of a couple of centuries ago, become rich "beyond the dreams of avarice" by frequenting the gaming tables or by becoming involved in insurance or lotteries. The cultural evolution of rational standards is apt to occur markedly faster than human evolution. In part, this cultural evolution creates the conditions whereby instrumental rationality separates from genetic optimization. As we add to the tools of rational thought, we add to the software that the analytic system can run to achieve long-leash goals that optimize actions for the individual. Learning a tool of rational thinking can quickly change behavior and reasoning in useful ways--as when a university student reads the editorial page with new reflectiveness after having just learned the rules of logic. Evolutionary change is glacial by comparison.
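
To make the gaming-table point concrete (a hypothetical illustration of mine, not an example from the interview): one of the simplest of these conceptual tools is an expected-value calculation. A lottery ticket that costs 2 units and pays 1,000,000 units with probability one in a million has an expected net payoff of

\[
E[\text{net payoff}] = \frac{1}{1{,}000{,}000} \times 1{,}000{,}000 - 2 = -1,
\]

so the bettor loses one unit per ticket on average. Someone who routinely runs this calculation can tell which wagers, lotteries, and insurance contracts are mispriced and which are not; someone who cannot is at the mercy of whoever set the odds, which is the sense in which a modern sophomore with introductory statistics would have had an enormous edge two centuries ago.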

Thus, in an astonishingly short time by evolutionary standards, humans can learn and disseminate--through education and other forms of cultural transmission--modes of thinking that can trump genetically optimized modules in our brains that have been driving our behavior for eons.  Because new discoveries by innovators can be conveyed linguistically, the general populace needs only the capability to understand the new cognitive tools--not to independently discover the new tools themselves.

Cultural increases in rationality itself might likewise be sustained through analogous mechanisms of cumulative ratcheting.  That is, cultural institutions might well arise that take advantage of the tools of rational thought, and these cultural institutions might enforce rules whereby people accrue the benefits of the tools of rationality without actually internalizing the rational tools.  In short, people just learn to imitate others in certain situations or “follow the rules” of rationality in order to accrue some societal benefits, while not actually becoming more rational themselves.

Cultural institutions themselves may achieve rationality at an organizational level without this entailing that the individual people within the organization are themselves actually running the tools of rational thought on their serial mental simulators.

Could you tell me about some of the questions that currently fascinate you? What are some of the research questions you would like to explore in the near future?

I am constantly asked about the possibility of a standardized rational thinking test.  I respond that there is no conceptual or empirical impediment to such an endeavor—just the will, money, and time.  I have begun, in ongoing writings, to sketch out a framework for the assessment of rational thought.

***

Further reading


  • What Intelligence Tests Miss: The Psychology of Rational Thought
  • Stanovich, K. E. (2009, Nov/Dec). The thinking that IQ tests miss. Scientific American Mind, 20(6), 34-39. Stanovich_IQ-Tests-Miss_SAM09.pdf
  • Stanovich, K. E. (2009). Distinguishing the reflective, algorithmic, and autonomous minds: Is it time for a tri-process theory? In J. Evans & K. Frankish (Eds.), In two minds: Dual processes and beyond (pp. 55-88). Oxford: Oxford University Press. Stanovich_Two_Minds.pdf
  • Stanovich, K. E. (2009).  Rationality versus intelligence. Project Syndicate.  link
  • Stanovich, K. E., & West, R. F. (2008). On the relative independence of thinking biases and cognitive ability. Journal of Personality and Social Psychology, 94, 672-695. JPSP08.pdf

Saturday, May 16, 2009

Can we get smarter? Yes we can!

Review of Nisbett, R. (2009). Intelligence and How to Get It: Why Schools and Cultures Count. New York and London: W. W. Norton, 282 pages, $17.79 hardcover.

by Coert Visser

Did you read the book The Bell Curve (Herrnstein and Murray, 1994)? Did it make you feel uneasy because you did not (want to) agree with its conclusions but did not exactly know how to refute them? Among its conclusions were (loosely formulated): 1) that intelligence is highly important in many areas of life, 2) that differences in intelligence are largely responsible for societal stratification, 3) that differences in intelligence are largely heritable, and 4) that intelligence gaps between (racial) groups are hard to close (if that is possible at all).

If you feel uneasy about these conclusions, read this book by psychologist Dick Nisbett (2009). You will probably like this book because it will provide answers to your questions. Not in a vague way but in a very specific, well-reasoned and research-based way. Here are some conclusions from the book:
  1. There is no fixed value for the heritability of intelligence. If the environment is very favorable to the growth and development of intelligence, the heritability of intelligence is fairly high, maybe up to 70%. If, however, the environment is highly variable, differing greatly between individual families, then the environment is going to play the major role in differences in intelligence between individuals (as is the case with the poor); see the sketch after this list.
  2. Aside from the degree to which heritability is important for one group or another in the population, heritability places no limits whatsoever on modifiability, for anybody.
  3. Intelligence is developable and schools can make children smarter, for instance by using computer-assisted teaching and certain types of cooperative learning. Genes play no role at all in race differences in IQ; environmental differences do.
  4. Believing that intelligence is under your control is a great start for developing intelligence.
  5. Certain habits and values in cultures can be highly beneficial for learning and developing intelligence.
  6. Parents can do a lot to increase the intelligence and academic achievement of children (both biological and didactic factors matter).
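
A minimal sketch of the logic behind point 1 (my own gloss, not notation from the book): heritability is the share of the total variance in a trait that is attributable to genetic variance,

\[
h^2 = \frac{V_G}{V_G + V_E},
\]

where V_G is genetic variance and V_E environmental variance (gene-environment interaction and covariance are ignored in this simplification). When environments are nearly uniform, V_E is small and h^2 is high; when environments differ wildly between families, V_E swamps V_G and h^2 drops, even though the genes themselves have not changed. Heritability is thus a property of a population in a particular range of environments, not a fixed attribute of the trait.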

Intelligence and How to Get It contains many very interesting citations of studies. Here are just a few examples. One example is the work by researchers like Urie Bronfenbrenner, Mike Stoolmiller and Eric Turkheimer, whose combined studies show how the famous twin studies systematically overestimate heritability. Another interesting example is the description of the famous Flynn effect, which shows how IQ scores can increase rapidly over generations. The book also mentions the work by Carol Dweck on fixed and growth mindsets. A fixed mindset is a way of viewing intelligence (and other personal characteristics) as unchangeable; either you’ve got it or you don’t. A growth mindset is one in which personal characteristics are viewed as modifiable. Dweck’s work shows that a fixed mindset leads to disregarding learning, while a growth mindset leads to the tendency to put effort into learning and performing and into developing strategies that enhance learning and long-term accomplishments. The book contains many more interesting findings, for instance about effective educational interventions, including evidence for which strategies work well in raising kids to be intelligent, strategies for bridging performance gaps between different ethnic groups, and more.

I think the content of this book will resonate well with many SF practitioners and researchers. This is why. In the SF approach, a dynamic rather than a static view of personal characteristics is held. As Thorana Nelson and Frank Thomas (2007), editors of the Handbook of Solution-Focused Brief Therapy, remind us: “Change is constant and inevitable; just as one cannot not communicate, one cannot not change.” (p. 10) This optimism about change applies both to one’s personal circumstances and to one’s behavior and characteristics. This is why a growth mindset fits better with an SF approach than a fixed mindset.

Until recently, a dominant view in psychology was that characteristics like intelligence and personality traits are largely unmodifiable. But a shift now seems to be happening: psychologists are discovering more and more that they have been too pessimistic and deterministic. People are far more capable of development than psychology has long thought. A case in point is the human brain. Scientists had long thought that the adult brain was incapable of significant structural change. Now it has been shown that the brain is far more flexible than that, and it is beyond dispute that the brain constantly changes itself as a consequence of experience. This phenomenon is called neuroplasticity. It is even possible for the brain to relocate brain activity associated with a certain function from one area to another, for instance in the case of brain damage. Another phenomenon long thought to be impossible and now well established is neurogenesis, the generation of new cells in the adult brain. Researcher Tracy Shors (2009) and her colleagues have shown that thousands of new cells are created every day.

Besides the shift toward a more optimistic view, there also seems to be a shift in psychology’s attention from a purely individualistic to a more systems-oriented view of human functioning. Traditional ‘hereditarians’ downplayed the role of the environment and of the efforts of schools, parents and society. The view presented in this book acknowledges the importance of such environmental factors. This is an example of how psychology may shift from a rather individualistic to a more interactional and situational perspective.

The research-based perspective offered in this book allows for an optimistic stance on educational and societal issues. It justifies persisting in efforts to design better learning environments, educational programs and teaching approaches. In several ways the book justifies the optimistic, interactional and contextual view of human functioning that SF practice uses.

This book is great. Let's hope it will inspire many parents, educators, policymakers and scientists. It has the potential.

References
  • Herrnstein, R. J., & Murray, C. (1994). The Bell Curve: Intelligence and class structure in American life. New York: Free Press.
  • Nelson, T. & Thomas, F. (Eds), (2007). Handbook of solution-focused brief therapy: Clinical Applications. New York: The Haworth Press.
  • Shors, T.L. (2009). Saving new brain cells. Scientific American, March 2009. http://www.sciam.com/article.cfm?id=saving-new-brain-cells
Coert Visser is a solution-focused trainer, coach, blogger and author. He can be reached at coert.visser@planet.nl

Monday, November 5, 2007

The True Nature of Intelligence

© 2004, Coert Visser

How do we view intelligence?
Both laymen and experts use widely varying definitions of intelligence. Try asking your friends and colleagues what they think intelligence is. You will probably get references to solving problems, being able to adapt, quick thinking, quick learning, being creative, being smart, reasoning logically, being sensible, analytic qualities, and so on. Interestingly, although the variation in answers is great, most laymen and experts seem to agree on certain aspects of intelligence.


Most people implicitly or explicitly assume that intelligence has the following three characteristics:
  1. Intrapersonal: intelligence is a characteristic of individuals. In other words: it is intrapersonal. It is inside you and is inseparable from you as an individual. Personnel selection psychologists often base their advice to a large degree on individual measurements of intelligence. Laymen too view intelligence mainly as something that is inside the person.
  2. One-dimensional: both laymen and experts acknowledge that different dimensions or aspects of intelligence can be distinguished but both groups treat intelligence mainly as if it were a one-dimensional concept. Selection psychologists speak of the so-called G-factor, the general intelligence factor and summarize the findings of intelligence measurements into a single (IQ-)score, while laymen too implicitly talk about intelligence as if it were one thing (“She has a high intelligence.”)
  3. Unchangeable: intelligence is a characteristic that is mostly unchangeable from the age of about 17. The assumed unchangeability, or stability, of intelligence implies that people keep the same intelligence level both across different situations and at different ages.
    In short: it is inside you, it is one thing and it is largely unchangeable.
Additional views
Experts on intelligence base their convictions on an impressive amount and quality of thinking and research (for an example see the website of Linda Gottfredson). What follows is not an attempt to attack the traditional view on intelligence but an attempt to provide a complementary view.

Intelligence can be seen as intrapersonal, one-dimensional and unchangeable but also as:
  1. Interpersonal: intelligence does not need to be seen only as something that is inside the head of the individual but can also be seen as something that emerges between people when they co-operate. This view opens the possibility that intelligence also happens between people. Every time two people deliver an intellectual performance that they could not have accomplished on their own, we see an example of the interpersonal aspect of intelligence. Hard to imagine? Think about this. The human brain is a network of approximately 100 billion brain cells (neurons) of different kinds, each connected to very many other neurons. It all adds up to an estimated total of 100 trillion connections. Although the brain is capable of impressive intellectual feats, the neurons of which it is built are not very intelligent. The intelligence is not in the neurons themselves but in the connections between them, in the network. The comparison between the brain and co-operating people should not be taken too far, if only because brains are unimaginably more complex than even the most complex organization. But the analogy does make it easier for us to imagine organizations as networks of interconnected people in which the value and intelligence of the organization is not solely in the people but also between the people. It makes it easier to think in terms of a collective intelligence.
  2. Multidimensional: intelligence does not have to be viewed only as something that is general and one-dimensional but can also be seen as a complex of a set of dimensions (see Sternberg, 1985). I am not pleading for a great stretching of the intelligence concept (as Gardner, 1991, does) by also labeling phenomena such as athletic ability as a kind of intelligence. Instead, I would propose to reserve the word intelligence for the cognitive domain. But also within this domain there are different relevant dimensions to be distinguished. One of the most convincing models, I find, is the one by David Perkins (1995), who distinguishes the following important dimensions: 1) Neural intelligence. This reflects the general information-processing capacity of the person, an aspect of intelligence that may touch on the G-factor. 2) Experiential intelligence. Intelligence that is based on experience and that manifests itself both explicitly and implicitly. You could call this a domain-specific or situational intelligence. 3) Reflective intelligence. This refers to tactics and techniques that you can apply to make use of your neural and experiential intelligence as effectively and efficiently as possible. You might call this meta-intelligence or strategic intelligence.
  3. Developable: viewing intelligence as a multidimensional phenomenon opens the possibility of seeing it as developable. While the G-factor indeed seems to be fixed or hardly developable, the other important dimensions do seem to be developable. Experiential intelligence can very well be developed (although this process goes very slowly). Reflective intelligence can even be developed quite quickly (Perkins, 1995).
Practical implications
Although both laymen and experts (sometimes) acknowledge that intelligence is to a certain degree interpersonal, multidimensional and developable, they don’t seem to use these views much in practice. If it is true that intelligence is also interpersonal, multidimensional and developable, then there are important practical implications. Below are two examples.

Personnel selection: more interactive, dynamic and situational
The selection psychologist would not only be interested in measuring and reporting ‘the’ intelligence of the applicant but also in the following aspects. How well does this applicant complement the collective intelligence of the team? In order to say something about this, an individual measurement would not be sufficient. There would have to be some kind of interaction between applicant and organization to assess the ‘chemistry’. Besides a measurement of general intellectual abilities, an assessment would be made of other aspects of intelligence, like relevant domain-specific experiential intelligence and meta-aspects like problem-solving strategies, thinking models, tactics, and so forth. If these views were taken into account, a selection process would be devised more interactively, more dynamically and more situationally.

View intelligence as a developable potential
For laymen too, it is important how they view and treat intelligence. Research by Carol Dweck (2002) has demonstrated that what people think about their own intelligence has far-reaching consequences. Dweck shows that people who see intelligence as unchangeable develop a tendency to focus on proving that they have that characteristic instead of focusing on the process of learning. This disregard of the learning process hinders their learning and their performance. This means that the wrong convictions about intelligence can make smart people dumb! But there is hope: when people view intelligence as a potential that can be developed, this leads to the tendency to put effort into learning and performing and into developing strategies that enhance learning and long-term accomplishments. An implication is that it pays off to help children and students invest in a view of intelligence as something that can be developed.

That the way we look upon phenomena can have drastic consequences has been known for a long time. It has now been demonstrated that the same goes for intelligence. A too restrictive definition of intelligence leads to practical limitations and problems. A realistic view on intelligence makes it possible to get rid of at least some of these restrictions and problems.

Literature
  1. Dweck, C. S. (2002). Beliefs that make smart people dumb. In R. J. Sternberg (Ed.), Why smart people can be so stupid. New Haven & London: Yale University Press.
  2. Gardner, H. (1991). Multiple intelligences. New York: Free Press.
  3. Perkins, D.N. (1995). Outsmarting IQ: The emerging science of learnable intelligence. New York: Free Press.
  4. Sternberg, R.J. (1985). Beyond IQ: A triarchic theory of human intelligence. New York: Cambridge University Press.