Final Study Guide

NOTE: This covers new material only. Be sure to consult study guides and the tests for Midterms 1 & 2 as well, as there will be some questions on previous material. Expect 10-11 questions from CT, with the rest of the questions from EI and LectureNotes & handouts. There will NOT be any questions from NOM.

From the Critical Thinking textbook:

Chapter 20: Deductive and Inductive Reasoning: Two methods of inference

Deductive reasoning: Drawing an inference from a general premise to a specific conclusion. Beginning with general or universal assumptions that you know to be true, and then arriving at particular conclusions based on these assumptions.

Also known as theory-driven, or top-down processing. Scientists use this type of reasoning when testing theories. This is accomplished by beginning with a proposed explanation for understanding observable phenomena from which specific predictions are derived.

Common mistakes in using deductive reasoning:

* Using valid logic, but starting with an erroneous premise.

* Starting with a valid assumption, but using invalid, or flawed, logic.

Inductive reasoning: Drawing an inference from specific instances to a general conclusion. Beginning with particular observations and then generalizing to broader principles.

Also known as data-driven, or bottom-up processing. Scientists use this type of reasoning when creating or building theories. Psychological constructs are also created using inductive reasoning.

Common mistakes in using inductive reasoning:

* Jumping to a conclusion using an insufficient or unrepresentative sampling of data.

* Availability Bias, Representativeness Bias, Confirmation Bias, Belief Perseverance Effect, Dichotomous thinking, and misinterpreting correlation as proving causation.

Chapter 21: Reactivity: To observe is to disturb

Reactivity: The impact that conducting research has on the entity being studied. The extent to which measuring something causes it to change.

This limits the ability of researchers to generalize their findings to other settings. Thus, it is a threat to a study's external validity.

This phenomenon is sometimes called the Hawthorne Effect, after the Hawthorne Plant in Chicago, where researchers who were looking at entirely different issues stumbled upon this effect.

Chapter 22: The Self-fulfilling Prophecy: When expectations create reality

Self-fulfilling prophecy: Occurs when people's attitudes, beliefs, or assumptions about another person produce the behaviors that they had initially expected to find.

Rosenthal & Jacobson (1968) conducted a famous study investigating this phenomenon. In this study, teachers were told that some of their students would experience an increase in their intellectual abilities. However, the students who were designated as these "bloomers" were actually randomly assigned these labels. Later on, these children's schoolwork improved and their IQs increased. Apparently, the teachers' behavior toward these children created the very outcomes that they expected.

Chapter 23: The Assimilation Bias: Viewing the world through schema-colored glasses

Assimilation Bias: The tendency to resolve discrepancies between preexisting schemas and new information by assimilating the information to fit the schema.

Schema: A cognitive structure that organizes a person's knowledge, beliefs, and past experiences. This provides a framework for understanding new events and experiences.

Schemata allow us to process a large amount of information in a relatively fast, efficient, and effortless manner.

When we encounter information discrepant from our schemas, we can use:

Accommodation: modify our schema to fit the new information.

Assimilation: modify the new information to fit our schema.

When new information does not fit with our schemas, we typically rely on assimilation rather than accommodation to resolve this discrepancy.

Relying on schemas can lead us to:

* create and maintain incomplete and inaccurate generalizations.

* resist changing beliefs even when faced with contradictory information.

* distort the perception, coding, and storage of information.

* commit thinking errors such as the Availability Bias, Belief Perseverance Effect, Confirmation Bias, Hindsight Bias, and the Self-fulfilling Prophecy.

Chapter 24: The Confirmation Bias: Ye shall find only what ye shall seek

Confirmation Bias: The tendency to selectively search for and gather evidence that is consistent with one's preconceptions, prior beliefs, and expectations.

This may occur when an interviewer asks "leading questions." Interviewers often find the personality traits that they are looking for by asking leading questions.

Projective personality testing is an example of confirmation bias.

To avoid the confirmation bias, people must first be aware that they are prone to be biased collectors of evidence.

Chapter 25: The Belief Perseverance Effect: The rat is always right

Belief Perseverance Effect: The tendency to cling to one's beliefs, even in the face of contradictory or disconfirming evidence.

We persevere in our beliefs by denying, discounting, or ignoring information that contradicts our original beliefs.

Even when presented with reliable, scientific evidence that contradicts our beliefs, we still may discount the evidence. We may simply distort the new information so that it appears to support our beliefs.

To correct this bias:

* counterargue your existing beliefs

* consider the opposite

* ask yourself: By changing the external situation, how might the behavior change?

Chapter 26: The Hindsight Bias: Predicting a winner after the race is finished

Hindsight Bias: The tendency to overestimate what could have been predicted, only after having learned the outcome. The "I-knew-it-all-along" phenomenon.

Events seem much less obvious and predictable beforehand than in hindsight. However, knowledge of an outcome usually makes this outcome seem inevitable to us.

Research example: After people learn the results of a study, they are often inclined to say that they could have predicted those same results. They then conclude that the results are merely common sense. Most often, this conclusion is due to the hindsight bias.

Chapter 27: The Representativeness Bias: Fits and misfits of categorization

Representativeness Bias: A cognitive strategy for quickly estimating the probability that a given instance is a member of a particular category. We use it to judge the likelihood that something or someone belongs to a specific category.

When faced with varying degrees of uncertainty in decision-making, people often rely on heuristics, or mental shortcuts. These heuristics reduce complex tasks to simpler, more efficient strategies. They allow fast and efficient information processing, but may also lead to imprecise processing.

The representativeness bias and the availability bias are examples of heuristics.

The representativeness bias may be caused when:

* Our initial prototypes or stereotypes are inaccurate or incomplete.

* We fail to take into account relevant statistical information (such as base rates, sample size, and chance probability).

* We neglect the effects of sample size. (A small sample size increases the probability that your estimate will be inaccurate.)

* We misunderstand what chance events should look like.
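The sample-size point above can be illustrated with a short simulation. This is a hypothetical sketch, not an example from the textbook: it repeatedly estimates a fair coin's heads rate from samples of a given size and counts how often the estimate misses the true rate badly.

```python
import random

random.seed(42)  # make the simulation repeatable

def badly_off_rate(sample_size, trials=2000):
    """Estimate a fair coin's heads rate from `trials` samples of
    `sample_size` flips each; return the fraction of estimates that
    miss the true rate (0.5) by more than 10 percentage points."""
    badly_off = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if abs(heads / sample_size - 0.5) > 0.10:
            badly_off += 1
    return badly_off / trials

# Small samples miss far more often than large ones.
print(badly_off_rate(10))    # roughly 0.34 (theory: 0.344)
print(badly_off_rate(1000))  # essentially 0.0
```

With only 10 flips, about a third of the estimates are off by more than 10 points, while with 1,000 flips such misses virtually never happen — which is why generalizing from a handful of instances is so error-prone.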

Chapter 28: The Availability Bias: The persuasive power of vivid events

The Availability Bias: A cognitive strategy for quickly estimating the frequency, incidence, or probability of a given event based on the ease with which instances are retrieved from memory.

Vivid, dramatic, important, and personally relevant events create the most powerful impressions on us. When something is easy to remember, we are prone to overestimate its occurrence. Additionally, when instances are quickly retrieved from memory, we tend to overestimate their frequency.

Unfortunately, the availability bias can lead us to overgeneralize from a few memorable examples.

Stereotyping example: one negative experience with a member of a minority group may be easily retrieved from memory. This may lead one to assume that the negative behavior occurs frequently in this minority group.

Chapter 29: The Insight Fallacy: To understand something isn't necessarily to change it

The Insight Fallacy: The erroneous belief that understanding the cause of a problem will solve the problem.

Insight into a problem may:

* Provide us with comfort through an understanding of the problem.

* Help us adopt specific problem-solving strategies.

* Give us a meaningful new understanding, thus creating a meaningful new wholeness in our thoughts, emotions, and actions.

However, simply understanding a problem does not, by itself, solve it.

Chapter 30: Every Decision Is a Trade-Off: Take stock of pluses and minuses

This chapter discusses the benefits and drawbacks of the meta-thoughts throughout the book. It also covers the arguments for and against the use of diagnostic categories in psychology.

Every decision and action involves benefits and costs. When making a decision, take into account the advantages and disadvantages involved.



Emotional Intelligence book, Chapters 15 & 16

Chapter 15

Based on teachers' assessments, what are the specific ways in which students were doing more poorly in the late 1980s compared to the mid-1970s?

Name 5 perceptual biases, thinking errors, or emotional deficits that characterize aggressive children. Identify two skills that can be taught to these children that help them overcome their problem.

Understand the factors involved in addiction among young people. What distinguishes young drinkers who become alcoholics from those who never become addicted? Name two emotional pathways to addiction.

Name two emotional deficits in girls that lead to eating disorders.

Chapter 16

Know the six steps in the Stoplight method for impulse control, and know the four steps in the SOCS problem-solving model.