Tversky, Amos

From New World Encyclopedia

Amos Tversky
Born: March 16, 1937
Died: June 2, 1996 (aged 59)
Education: University of Michigan
Occupation: Psychologist, economist (behavioral economics)
Known for: Prospect theory

Amos Nathan Tversky (Hebrew: עמוס טברסקי; March 16, 1937 – June 2, 1996) was a cognitive and mathematical psychologist, a pioneer of cognitive science, a longtime collaborator of Daniel Kahneman, and a key figure in the discovery of systematic human cognitive bias and the handling of risk.

Life

Amos Tversky was born in Haifa, Israel, on March 16, 1937, to parents who had emigrated from Poland and Russia. His father, Yosef Tversky, was a veterinarian and his mother, Genia, was a member of the Knesset from its establishment in 1948 until her death in 1964. Tversky was an officer in the paratroopers, an elite unit, eventually rising to captain and serving in three wars. He became a "legend" within his unit, after he was awarded Israel's highest honor for personal bravery during a 1956 border skirmish.

Tversky earned a bachelor's degree from Hebrew University in 1961 and his doctorate in 1965 from the University of Michigan, where he met and married his wife, Barbara, a fellow student in cognitive psychology. Barbara Tversky became a professor of cognitive psychology at Stanford.

After holding several teaching positions at Michigan, Harvard, and Hebrew University, Amos Tversky went to Stanford in 1970 as a fellow at the Center for Advanced Study in the Behavioral Sciences. He joined the Stanford psychology faculty in 1978. At the time of his death in 1996, he was the Davis Brack Professor of Behavioral Sciences in the Department of Psychology.

When he won a five-year MacArthur Foundation fellowship in 1984, Tversky said with typical modesty that much of what he had studied was already known to "advertisers and used car salesmen." His theoretical modeling, however, elucidated the basis for such phenomena as consumers becoming upset if a store charged a "surcharge" for using a credit card but being pleased if a store offered a "discount" for paying with cash.

Tversky was elected to the American Academy of Arts and Sciences in 1980 and as a foreign associate of the National Academy of Sciences in 1985. He won the American Psychological Association's award for distinguished scientific contribution in 1982. He also was awarded honorary doctorates by the University of Chicago, Yale University, the University of Goteborg (Sweden) and the State University of New York at Buffalo. He served on Stanford's Faculty Senate from 1990 until his death and was a member of the Academic Council's advisory board to the president and provost.

Amos Tversky died in 1996 from metastatic melanoma.

Work

Tversky's professional ideas and contributions revolutionized not only his own field of cognitive psychology (Tversky 1970), but that of economics as well. Much of his early work concerned the foundations of measurement. He was co-author of a three-volume treatise, Foundations of Measurement (Tversky et al. 1971, 1989, 1990). His early work with Daniel Kahneman focused on the psychology of prediction and probability judgment. Later, he and Kahneman originated prospect theory to explain irrational human economic choices. Tversky also collaborated with Thomas Gilovich, Paul Slovic, and Richard Thaler on several key papers.

Tversky's way of thinking established and outlined the new meaning he brought to the study of political science. His work had three specific foci:

  • judgment under uncertainty (Tversky and Kahneman 1972)
  • decision-making under risk (Tversky and Kahneman 1979), and
  • reason-based choice (Tversky and Kahneman 1981).

Two noteworthy points emerge from review and analysis of his work:

First, Tversky's work stresses the importance of reason-based choice, whereby individuals actively seek to generate, understand, and justify their decisions.

Second, Tversky's work suggests that people do not act even as if they were the value-maximizers they are purported to be by more rationally based theories, such as expected utility. Rather, individuals function as problem-solvers who creatively construct their choices and resolve complex problems that require trade-offs between values and goals. In this way, preferences are created, rather than elicited, within the process and context of choice itself (McDermott 2001).

Approach to cognitive science

Tversky's early work on judgment began in Israel with another Israeli-born psychologist, Daniel Kahneman, who became his close friend and long-time collaborator. They detailed 11 "cognitive illusions," or biasing characteristics of human judgment, and proposed systematic explanations for them (Tversky and Kahneman 1974). This publication triggered a "cascade of related research," as Science News wrote in a 1994 article tracing the history of research on reasoning. Decision theorists in economics, business, philosophy, and medicine, as well as psychologists, cited their work.

Tversky's later work on decision making, some of it with Kahneman, showed how people make choices under conditions of uncertainty. His experiments and theories were so precise and broadly applicable that teachers can make up their own experiments, try them in any class, and be reasonably certain the students will demonstrate the judgment pattern that Tversky had specified.

In psychology, heuristics are simple, efficient rules of thumb that people use to make decisions, typically when facing complex problems or incomplete information (Tversky 1972). These rules work well under most circumstances, but in certain cases lead to systematic cognitive biases. For instance, people may tend to perceive more expensive beers as tasting better than inexpensive ones. This finding holds even when prices and brands are switched; putting the high price on the normally relatively inexpensive brand is enough to lead experimental subjects to perceive that beer as tasting better than the beer that is normally relatively expensive. This is known as the "price infers quality" bias.

Tversky was an astute observer of how people made decisions and was good at explaining his ideas to others.

"One of his most beautiful pieces in recent years started out from observations he made at a Stanford faculty meeting," said Kahneman. Tversky observed that the faculty wanted to make appointment offers to two people for two vacancies but had decided to hold up the offer to one until after they had made the offer to the first, whom they wanted more. When Tversky pointed out that there was no reason to hold off on the offer to the second (since two positions were available), the committee made both offers. Tversky went on to demonstrate similar behavior in a number of laboratory experiments.

Two examples of his approach to "perceived" probability follow:

Example A

Tversky (1972) conducted an experiment using the following story:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.

Participants were then asked to rank statements by their probability. Among the sentences were the following:

(1) Linda is a bank teller.
(2) Linda is a bank teller and is active in the feminist movement.

Based on probability theory, statement (1) is more probable than statement (2), because statement (2) involves a conjunction of two statements. However, participants almost always (86 percent) ranked (2) over (1), indicating that they thought it was more probable. This is the so-called "conjunction fallacy." The standard explanation is that, given her description, Linda is more representative of feminist bank tellers than of bank tellers in general. When people refer to the conjunction fallacy, or to the Kahneman and Tversky experiment, they often call it the "Linda the feminist bank teller" problem or experiment.
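The conjunction rule behind the fallacy, that P(A and B) can never exceed P(A), can be illustrated with a minimal simulation. The base rates below are invented purely for illustration and do not come from the experiment:

```python
import random

# Monte Carlo illustration of the conjunction rule: however the
# base rates are chosen, the conjunction (teller AND feminist)
# can never be more frequent than the single event (teller).
random.seed(0)

trials = 100_000
count_teller = 0   # event A: "Linda is a bank teller"
count_both = 0     # event A and B: "bank teller AND feminist"

for _ in range(trials):
    is_teller = random.random() < 0.05    # hypothetical base rate
    is_feminist = random.random() < 0.60  # hypothetical base rate
    if is_teller:
        count_teller += 1
        if is_feminist:
            count_both += 1

p_a = count_teller / trials
p_ab = count_both / trials
print(p_ab <= p_a)  # True: the conjunction is never the more probable event
```

Whatever numbers are substituted for the two base rates, the final comparison always prints True, which is exactly the constraint the experimental participants violated.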

Example B

Tversky and his co-authors (Tversky et al. 1985) investigated the common misperception among basketball fans and players alike that players tend to get "hot," that is, they sometimes hit a string of shots that is markedly longer than would be predicted on the basis of their overall shooting percentage. Interviews with players revealed that they regularly passed the ball to a teammate who had made a series of consecutive shots so that he could shoot again, believing that he was on a "hot" streak. Analyzing the data for individual members of the 1980-1981 Philadelphia 76ers (including the famous "hot streak" player Andrew Toney), Tversky found no statistical evidence of this "hot hand" phenomenon. Tversky and his colleagues also examined data from the Boston Celtics during the 1980-1981 and 1981-1982 seasons. The data failed to show that any of their players showed a tendency to have the success of their first shot affect the success of a subsequent shot (McDermott 2001).

In this case, Tversky explained that fans and players are merely observing the standard laws of chance in action, but in the context of their misconception of how random distributions work. People expect the outcomes of a process determined by chance, such as a coin toss with its 50 percent distribution of heads, to apply to each and every segment of the process. Thus, while it is not uncommon to observe four heads in a row out of 20 flips of the coin, observers assess such a result as non-random (a "hot streak"), since they expect each new toss to produce the opposite outcome so that the 50 percent distribution is maintained in every pair of tosses. In fact, the laws of chance state that such strict alternation is the unlikely outcome, and that a string of several heads (or making four successful shots in basketball) is more likely (McDermott 2001).
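A short simulation, assuming nothing more than a fair coin, shows how common such "streaks" are: a run of four or more consecutive heads appears somewhere in 20 flips nearly half the time.

```python
import random

# Estimate how often 20 fair coin flips contain a run of at least
# four consecutive heads, the kind of streak observers tend to
# misread as non-random.
random.seed(1)

def has_run(flips, length=4):
    """Return True if `flips` contains `length` consecutive heads."""
    run = 0
    for heads in flips:
        run = run + 1 if heads else 0
        if run >= length:
            return True
    return False

trials = 100_000
hits = sum(has_run([random.random() < 0.5 for _ in range(20)])
           for _ in range(trials))
print(f"P(run of 4+ heads in 20 flips) ~ {hits / trials:.2f}")
```

The estimate lands near 0.48, so a "streak" that looks remarkable to an observer is close to an even-odds event.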

Prospect theory

Amos Tversky called the studies he carried out with Daniel Kahneman (Tversky and Kahneman 1979) on how people manage risk and uncertainty "prospect theory" for no other reason than that it is a catchy, attention-getting name. Their theory, developed over a thirty-year period, is highly significant in economics, and especially in financial economics: the branch of economics concerned with the workings of financial markets, such as the stock market, and the financing of companies, addressing questions framed in terms of factors such as time, uncertainty, options, and information.

Tversky started their research by investigating apparent anomalies and contradictions in human behavior. Subjects, when offered a choice formulated in one way, might display risk-aversion, but when offered essentially the same choice formulated in a different way, might display risk-seeking behavior. For example, people may drive across town to save $5 on a $15 calculator but not drive across town to save $5 on a $125 coat (Bernstein 1996).

One very important result of Tversky and Kahneman's work is demonstrating that people's attitudes toward risks concerning gains may be quite different from their attitudes toward risks concerning losses. For example, when given a choice between receiving $1000 with certainty or having a 50 percent chance of receiving $2500, people may choose the certain $1000 over the uncertain chance of $2500 even though the mathematical expectation of the uncertain option is $1250. This is a perfectly reasonable attitude that is described as "risk-aversion." However, Kahneman and Tversky found that the same people when confronted with a certain loss of $1000 versus a 50 percent chance of no loss or a $2500 loss often chose the risky alternative. This is called "risk-seeking" behavior. This is not necessarily irrational but it is important for analysts to recognize the asymmetry of human choices (Bernstein 1996).
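The arithmetic behind the asymmetry above can be computed directly; in both frames the gamble has the larger expected magnitude, yet typical choices flip between the frames:

```python
# Expected values of the two choices described above.

# Gains frame: $1000 for certain vs. a 50% chance of $2500 (else nothing).
certain_gain = 1000
ev_gain_gamble = 0.5 * 2500 + 0.5 * 0

# Losses frame: lose $1000 for certain vs. a 50% chance of losing $2500.
certain_loss = -1000
ev_loss_gamble = 0.5 * (-2500) + 0.5 * 0

print(ev_gain_gamble)  # 1250.0: better than the sure $1000, yet people take the sure gain
print(ev_loss_gamble)  # -1250.0: worse than the sure -$1000, yet people take the gamble
```

Risk-aversion over gains and risk-seeking over losses both run against the expected-value comparison, which is exactly the asymmetry Tversky and Kahneman documented.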

Peter Bernstein (1996), in a study based on prospect theory, reported some interesting results regarding estimates (in this case by 120 Stanford graduates) of the probability of dying from various causes.

Estimates of Probabilities of Death From Various Causes

Cause                     Subject Estimate    Statistical Estimate
Heart Disease                  0.22                 0.34
Cancer                         0.18                 0.23
Other Natural Causes           0.33                 0.35
All Natural Causes             0.73                 0.92
Accident                       0.32                 0.05
Homicide                       0.10                 0.01
Other Unnatural Causes         0.11                 0.02
All Unnatural Causes           0.53                 0.08


The above table represents the probability estimates of one group in the study. Another group was not asked to estimate the probabilities for separate causes, but only the probability of death by natural versus unnatural causes. The probability estimate of a natural death by this second group was 0.58, significantly lower than when the subjects considered each cause separately. The second group's estimate of an unnatural death was 0.32, again significantly lower than for the first group. The most notable aspect of the estimates is that the subjects significantly underestimated the probabilities for natural causes and vastly overestimated the probabilities for unnatural causes. This indicates that people probably worry more about unnatural dangers and not enough about natural ones (Bernstein 1996).
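The internal arithmetic of the first group's estimates can be checked in a few lines. The separate-cause figures do add up to the "all causes" rows, but since natural and unnatural causes together are exhaustive, the combined total of 1.26 is itself an overestimate:

```python
# Subject estimates for the first group, taken from the table above.
natural = {"heart disease": 0.22, "cancer": 0.18, "other natural": 0.33}
unnatural = {"accident": 0.32, "homicide": 0.10, "other unnatural": 0.11}

total_natural = round(sum(natural.values()), 2)      # matches "All Natural Causes"
total_unnatural = round(sum(unnatural.values()), 2)  # matches "All Unnatural Causes"

print(total_natural)                                  # 0.73
print(total_unnatural)                                # 0.53
print(round(total_natural + total_unnatural, 2))      # 1.26, i.e. more than 1
```

Since every death is either natural or unnatural, a coherent set of estimates would sum to 1; the excess comes almost entirely from the inflated unnatural-cause figures.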

Economics and law

The core of the economic analysis of law is a microeconomic approach defined by the Coase Theorem, attributed to Ronald Coase, a neoclassical economist of the Chicago school of economics. This describes the economic efficiency of an economic allocation or outcome in the presence of externalities. In this theorem the model of the individual is homo economicus:

All human behavior can be viewed as involving participants who ... maximize their utility from a stable set of preferences and accumulate an optimal amount of information and other inputs in a variety of markets (Becker 1998, 3-4).

The Coase Theorem, which predicts how economically rational individuals will behave in free markets, is particularly important to legal rules and procedures. It asserts that, in the absence of transaction costs, no matter on whom the costs or liabilities of engaging in an activity are imposed, the parties will bargain to achieve the socially optimal level of that activity (Brest 2006).

Some, such as Herbert Simon, acknowledged that the choices made by individuals cannot in actuality be predicted based on rationality, but rather their choices should be described as "boundedly rational." However, if humans were thought of as only "boundedly rational," the bounds were not very constraining and, more importantly, they did not bias decisions in any systematically predictable manner.

The neoclassical approach held sway until the work of Tversky and Kahneman. Although their "behavioral law and economics" has not replaced the neoclassical tradition—and indeed has been strongly resisted by the neoclassicists—it has nevertheless been applied to virtually every area of the law, both to explain behavior and prescribe normative substantive and procedural rules (Brest 2006).

A purely cognitive explanation of the phenomenon has been supplemented by one that focuses on affect. Cass Sunstein, a professor of law at Chicago, coined the term "probability neglect" to describe how, when emotions run high, for example, when contemplating dreadful risks, people tend to greatly overweight probabilities or to ignore them altogether and focus only on the horrific, worst-case outcome. Indeed, affect has come to play an increasingly important role in behavioral economics, for example, in understanding dynamically inconsistent preferences or hyperbolic discounting. Although this was not the major focus of Amos Tversky's work, the lines of thought certainly trace back to him, for example through the pioneering work in this area of his collaborator Paul Slovic (Brest 2006).

Example

Imagine you are a member of a jury judging a hit-and-run driving case. A taxi hit a pedestrian one night and fled the scene. The entire case against the taxi company rests on the evidence of one witness, an elderly man who saw the accident from his window some distance away. He says that he saw the pedestrian struck by a blue taxi. In trying to establish her case, the lawyer for the injured pedestrian establishes the following facts:

  1. There are only two taxi companies in town, 'Blue Cabs' and 'Black Cabs'. On the night in question, 85 percent of all taxis on the road were black and 15 percent were blue.
  2. The witness has undergone an extensive vision test under conditions similar to those on the night in question, and has demonstrated that he can successfully distinguish a blue taxi from a black taxi 80 percent of the time.

Typical jurors, faced with eye-witness evidence from a witness who has demonstrated that he is right four times out of five, are inclined to declare that the pedestrian was indeed hit by a blue taxi, and assign damages against the Blue Taxi Company. Indeed, if challenged, they might say that the odds in favor of the Blue Company being at fault were exactly four out of five, those being the odds in favor of the witness being correct on any one occasion.

However, in actuality the facts are quite different. Based on the data supplied, the mathematical probability that the pedestrian was hit by a blue taxi is only 0.41, or 41 percent, less than half. In other words, the pedestrian was more likely to have been hit by a black taxi than a blue one. The error in basing a decision on the accuracy figure for the witness is that it ignores the overwhelming probability, based on the figures, that any taxi in the town is likely to be black. If the witness had been unable to identify the color of the taxi, but had only been able to state, with 100 percent accuracy, that the accident was caused by a taxi, then the probability that it was a black taxi would have been 85 percent, the proportion of taxis in the town that are black. So before the witness testified to the color, the chances were low, namely 15 percent, that the taxi in question was blue. This figure is generally referred to as the "prior probability": the probability based purely on the way things are, not on the particular evidence pertaining to the case in question. When the witness then testified to the color, that evidence increased the odds from the 15 percent prior probability figure, but not all the way to the 80 percent figure of the witness's tested accuracy. Rather, the reliability figure for the witness's evidence must be combined with the prior probability to give the real probability. The exact mathematical manner in which this combination is done is known as Bayes' law, which here yields a probability of 41 percent (Devlin 1997).
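The combination that Bayes' law performs can be reproduced with a few lines of arithmetic, using only the figures given in the example:

```python
# Bayes' law applied to the taxi problem.
p_blue = 0.15     # prior: proportion of blue taxis on the road
p_black = 0.85    # prior: proportion of black taxis on the road
accuracy = 0.80   # witness identifies taxi color correctly 80% of the time

# Probability the witness says "blue": correct identifications of
# blue taxis plus mistaken identifications of black taxis.
p_says_blue = p_blue * accuracy + p_black * (1 - accuracy)

# Posterior probability the taxi really was blue, given the testimony.
p_blue_given_says_blue = (p_blue * accuracy) / p_says_blue
print(round(p_blue_given_says_blue, 2))  # 0.41
```

The posterior of 0.12 / 0.29 ≈ 0.41 sits between the 15 percent prior and the 80 percent accuracy figure, which is precisely the point of the example.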

Comparative ignorance

Tversky and Fox (1995) addressed ambiguity aversion, the idea that people do not like ambiguous gambles or choices with ambiguity, with the comparative ignorance framework. Their idea was that people are only averse to ambiguity when their attention is specifically brought to the ambiguity by comparing an ambiguous option to an unambiguous option. For instance, people are willing to bet more on choosing a correct colored ball from an urn containing equal proportions of black and red balls than an urn with unknown proportions of balls when evaluating both of these urns at the same time. However, when evaluating them separately, people were willing to bet approximately the same amount on either urn. Thus, when it is possible to compare the ambiguous gamble to an unambiguous gamble people are averse, but not when ignorant of this comparison.

Another common mistake in reasoning that Tversky discovered is the tendency to assess the frequency of a given event by how easy it is to think of examples of that event. Most people will estimate that there are more English words that begin with the letter k than words whose third letter is k, even though the opposite is true, simply because it is easier to think of examples of the former (Brest 2006).

Legacy

There is a famous quote from Tversky:

It's what I do for a living: If you look at people as intuitive scientists, you find that we are very good at pattern generation, we are very good at generating hypotheses. It's just that we are not very good at all at testing hypotheses (Kolata 1996).

With this in mind, it is no wonder that much of Tversky's role in reshaping the cognitive sciences lay in convincing economists to pay attention to what people actually do instead of what they would do if they behaved rationally. When he won a five-year MacArthur Foundation fellowship in 1984, Tversky said with typical modesty that much of what he had studied was already known to "advertisers and used car salesmen but not to economists."

Here is an example of what he was talking about. If there is a 10 percent chance that a customer will buy the blue Chevrolet when the salesman shows only that one car, then there is a greater than 10 percent chance that the customer will buy it when the salesman shows both the blue Chevrolet and a green Ford that is less desirable to the customer but at the same price as the Chevrolet. A good car salesman would never show just one car (Lowenstein 1996).

"He certainly changed my life, applying just the concept that people's reasoning is imperfect, susceptible to error and amenable to corrective procedures," said Donald Redelmeier, a physician at the University of Toronto who did research with Tversky.

Tversky's and Kahneman's work on "framing," the idea that small differences in how data are presented to people have a substantial effect on their decisions, has influenced the way doctors view informed consent from patients for medical procedures. Barbara McNeil worked with Tversky on what she refers to as the 1 percent-99 percent problem. One and 99 add up to 100, so that if someone is told he or she has a 1 percent chance of dying during a given medical procedure, that person also has a 99 percent chance of living. Their studies showed that people will be more optimistic or pessimistic about the procedure depending upon which way the information is stated to them. "At a time when medical technology has advanced and patients are being asked to make more decisions about medical options, this is even more important than it was in 1980," when the research was done, McNeil said.

Daniel Kahneman received the Nobel Prize in 2002 for the work he did in collaboration with Amos Tversky, who would no doubt have shared in the prize had he been alive. Kahneman devoted a substantial part of his acceptance speech to Tversky, expressing regret that his longtime collaborator was not able to share the distinction with him. Just as Beatles fans could not fully appreciate the 1997 knighting of Sir Paul McCartney in the absence of the late John Lennon, the announcement of the 2002 Nobel Memorial Prize in Economic Sciences felt incomplete without Tversky.

To summarize, Amos Tversky, a cognitive psychologist, was a dominant figure in decision research and a leading psychological theorist who seriously challenged economic theory by showing that people frequently do not behave rationally to maximize their welfare. His work on the limits of human rationality also had a major impact on philosophy, statistics, political science, law, and medicine.

His work had a great impact on economics, said Kenneth Arrow, professor emeritus of economics, because he tested hypotheses of rationality that are central to predicting how economies behave:

The hypothesis of rational behavior has been central to economics, though always held with some discomfort ... Previous criticism of economic postulates by psychologists had always been brushed off by economists, who argued, with some justice, that the psychologists did not understand the hypotheses they criticized. No such defense was possible against Amos' work. (Stanford University News Service 1996).

Major publications

  • Tversky, Amos, C. H. Coombs, and Robyn Dawes. 1970. Mathematical Psychology: An Elementary Introduction. Englewood Cliffs, NJ: Prentice-Hall.
  • Tversky, Amos, and Daniel Kahneman. 1974. Judgment under uncertainty: Heuristics and biases. Science 185(4157): 1124-1131.
  • Tversky, Amos, and Daniel Kahneman. 1979. Prospect theory: An analysis of decision making under risk. Econometrica 47(2): 263-292. (Note: This is the most cited article in the history of this premier economic journal.)
  • Tversky, Amos, Daniel Kahneman, and Paul Slovic. 1981. Judgment under Uncertainty: Heuristics and Biases. Cambridge, UK: Cambridge University Press. ISBN 0521284147
  • Tversky, Amos, and Daniel Kahneman. 1981. The framing of decisions and the psychology of choice. Science 211: 453-458.
  • Tversky, Amos, T. Gilovich, and R. Vallone. 1985. The hot hand in basketball: On the misperception of random sequences. Cognitive Psychology 17: 295-314.
  • Tversky, Amos, D. E. Bell, and H. Raiffa. 1988. Decision Making: Descriptive, Normative, and Prescriptive Interactions. Cambridge, UK: Cambridge University Press.

References

  • Becker, G. S., The Economic Approach to Human Behavior (5 ed.). Chicago: University of Chicago Press, 1998
  • Bernstein, Peter, Against the Gods: The Remarkable Story of Risk, John Wiley & Sons, New York, 1996
  • Brest, Paul,“Amos Tversky's contributions to legal scholarship,” Judgment and Decision Making, vol. 1, no. 2, November 2006, pp. 174-178.
  • Devlin, Keith, Goodbye Descartes: The End of Logic and the Search for a New Cosmology of Mind, John Wiley and Sons, 1997
  • Kolata, Gina. 1996. Could It Be? Weather Has Nothing To Do With Your Arthritis Pain? The New York Times, April 3, 1996. Retrieved February 17, 2009.
  • Lowenstein, Roger “Outsider who challenged dismal science,” Wall Street Journal, 6 June 1996, C1
  • Lowenstein, Roger “Sure, markets are rational, just like life,” Wall Street Journal, 13 June 1996, C1
  • McDermott, Rose, "The Psychological Ideas of Amos Tversky and Their Relevance for Political Science," Journal of Theoretical Politics, Vol. 13, No. 1, 5-33, 2001
  • Simon, H. A., "A Behavioral Model of Rational Choice," The Quarterly Journal of Economics, Vol. 69, No. 1, 1955, pp. 99-118.
  • Tversky, A., D.H. Krantz, R.D. Luce, and P. Suppes, Foundations of Measurement, Volume 1: Additive and Polynomial Representations, Academic Press, New York, 1971
  • Tversky, A., D.H. Krantz, R.D. Luce, and P. Suppes, Foundations of Measurement, Volume 2: Geometrical, Threshold and Probabilistic Representations, Academic Press, New York, 1989
  • Tversky, A., D.H. Krantz, R.D. Luce, and P. Suppes, Foundations of Measurement, Volume 3: Representation, Axiomatisation and Invariance, Academic Press, New York, 1990

  • Tversky, A. and D. Kahneman, "Subjective probability: A judgment of representativeness," Cognitive Psychology, 3:430-454, 1972; see also Tversky, A. and Kahneman, D., "Extension versus intuitive reasoning: The conjunction fallacy in probability judgment," Psychological Review, 90, 293-315, 1983
  • Tversky, A., and R.H. Thaler,"Preference Reversals" in R. H. Thaler, The Winner's Curse: Paradoxes and Anomalies of Economic Life, Princeton U. Press, pp. 79-91, 1992
  • Tversky, A. and Craig R. Fox, "Ambiguity Aversion and Comparative Ignorance," The Quarterly Journal of Economics, vol. 110, pp. 585-603, 1995
  • Tversky, A. and Daniel Kahneman, (editors), Choices, Values, and Frames, Cambridge University Press, 2000

Credits

New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license, which can reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation.

Note: Some restrictions may apply to use of individual images which are separately licensed.