Chapter 6: Basic Principles of Learning
Instructor's Resource Manual



chapter 6
basic principles of learning

Chapter Outline
Learning Objectives
Lecture Enhancers
    The Laws of Conditioning
    Lots of Learning
    Bear Boys, Swine Girls, Wolf Children
    Pigeon Overhead: Bombs Away!
    Punishment
    The Cat's Out of the Bag...er, Box!
    Applied Learning
Student Assignments
    Classical Conditioning and the Pupil Dilation Response
    Conditioning in Everyday Life
    Behavior Modification Project
    Behaviorism in Literature: Walden Two
Demonstrations and Activities
    Defining Learning
    Classical Salivary Conditioning
    Understanding the Elements of Classical Conditioning
    Shark Attack!
    Crunch! A Quick Demonstration of Classical Conditioning
    Operant Conditioning in Human Behavior
    Using Candy to Illustrate Operant Conditioning Concepts
    Conditioning a Student "Rat"
    Reinforcement vs. Punishment
    Schedules of Reinforcement
    Human Cognitive Maps
Video
Transparencies
Handouts
    Defining Learning
    Elements of Classical Conditioning
    Reinforcement versus Punishment
    Schedules of Reinforcement


chapter outline

I. What Is Learning?

II. Classical Conditioning

III. Operant Conditioning

IV. Extinction

V. Observational Learning

VI. Behavior Modification


learning objectives

Students should be able to:

  1. Define learning and distinguish it from automatic responses and changes due to maturation.
  2. Define classical conditioning and explain how the various components of classical conditioning are related.
  3. Describe how phobias are acquired and eliminated using the principles of classical conditioning.
  4. Explain the significance of Watson and Rayner's Little Albert experiment and discuss pleasant unconditioned stimuli.
  5. Describe the following acquisition sequences and their effects: trace conditioning, delayed conditioning, simultaneous conditioning, and backward conditioning.
  6. Define and give examples of extinction, spontaneous recovery, generalization, and discrimination.
  7. Explain the significance of contingency theory and blocking.
  8. Explain the concept of taste-aversion learning and its relation to biological preparedness.
  9. Define operant conditioning and distinguish between primary and secondary reinforcement and between positive and negative reinforcement.
  10. Describe the function of a Skinner Box and define shaping.
  11. List and describe the various types of reinforcement schedules and their effect on behavior/response rates.
  12. Distinguish between insight learning, latent learning, and serial enumeration.
  13. Define punishment and summarize the six guidelines for effective punishment.
  14. Explain the partial-reinforcement effect and the meaning of stimulus control.
  15. Define observational learning and summarize the conditions that must be present for it to occur.
  16. Define behavior modification and describe situations in which it is effectively used.


lecture enhancers

The Laws of Conditioning

Students of learning theory are familiar with Edward Thorndike's law of effect. Stating, roughly, that behaviors followed by reinforcement tend to be repeated in the future, the law of effect forms the cornerstone of operant conditioning. Less well known, however, are the other laws Thorndike formulated and the revisions each underwent.

For example, the law of effect as we know it today was originally stated somewhat differently. Prior to 1930, Thorndike emphasized the connection between a stimulus and a response, and how that connection could be strengthened or weakened depending on the consequences of the response. His original formulation, then, stated that if a response was followed by satisfaction (i.e., the maintenance of some state that is agreeable to the organism), the strength of the connection would be increased. If the response was followed by an annoyance (i.e., a situation the organism seeks to avoid), the connection between stimulus and response would be weakened. After 1930, Thorndike revised the law of effect after realizing that, in effect, the effect was incorrect. Put simply, the original law is only half true: reinforcement increases the strength of a connection, whereas punishment does little to weaken one.

Similarly, Thorndike's law of readiness and law of exercise underwent revision. The law of readiness states that it is satisfying to complete an act once one has prepared to do so, whereas it is frustrating not to be able to perform an act, or to be forced to perform one when one does not want to. In short, the law of readiness states that interference with goal-directed behavior is aversive, a point that's difficult to argue with. The law of exercise, on the other hand, can be quibbled with; in fact, Thorndike did so himself. Prior to 1930, Thorndike held that connections between a stimulus and a response are strengthened as they are used (the law of use) and weakened if they are not (the law of disuse). In other words, exercising or discontinuing a stimulus-response connection can, respectively, strengthen or weaken that connection. Thorndike later abandoned the law of exercise entirely.

Hergenhahn, B. R., & Olson, M. H. (1993). An introduction to theories of learning (4th ed.). Englewood Cliffs, NJ: Prentice Hall.

Lots of Learning

Learning theories are typically divided into classical and operant conditioning, cognitive learning, observational learning, and a handful of miscellaneous approaches. Edward Tolman alone, however, perhaps true to his iconoclastic ways, proposed six kinds of learning:

  1. Cathexes: learned connections between basic drives and the particular goal objects that satisfy them.
  2. Equivalence beliefs: learning that a subgoal (such as praise or money) has the same value as the goal itself.
  3. Field expectancies: learned knowledge of how the environment is laid out, akin to cognitive maps.
  4. Field-cognition modes: learned strategies or modes of approaching a problem situation.
  5. Drive discriminations: learning to distinguish among one's own drive states (e.g., hunger versus thirst) and to respond accordingly.
  6. Motor patterns: learned movements, which Tolman granted were well handled by Guthrie's contiguity principle.

In many ways, Tolman's statement of these types of learning reflects his attempt to consolidate the best ideas from Clark Hull, Edwin Guthrie, Gestalt theories, operant conditioning, and his own views into a single system. Aspects of those various schools of thought are either evident or implied in many of the types of learning Tolman proposed.

Hergenhahn, B. R., & Olson, M. H. (1993). An introduction to theories of learning (4th ed.). Englewood Cliffs, NJ: Prentice Hall.

Tolman, E. C. (1949). There is more than one kind of learning. Psychological Review, 56, 144-155.

Bear Boys, Swine Girls, Wolf Children

Cases of feral children can be traced back for centuries, at least as far as the celebrated case of Romulus and Remus. The term feral, in its modern usage, refers to a number of situations: human children raised by animals, children surviving in the wilderness, children raised in isolated confinement, or children raised in confinement with little human contact. Regardless of the circumstances, children reared under atypical conditions present a unique case of learning.

Carolus Linnaeus first documented cases of feral children based largely on anecdotal evidence. Colorful figures such as the Hessian wolf-boy (1344), the Lithuanian bear-boy (1661), or the Irish sheep-boy (1672) covered both a lot of terrain and much of the animal kingdom, and provided ammunition for thinkers from Jean-Jacques Rousseau to Franz Gall about the contributions of nature and nurture to human development. Other notable cases include the Wild Boy of Aveyron, Kaspar Hauser, and Wild Peter. It wasn't until the well-known case of the wolf-children of Midnapore, Kamala and Amala, that structured psychological study of feral children began. Captured in 1920 by Reverend J. A. L. Singh and a hunting party, Kamala (approximately age 8) and Amala (perhaps 1½) were seen in the presence of a wolf mother and three cubs, and were taken as they tried to leave their den. Amala died after a year apart from the wolves, although Kamala lived until about the age of sixteen. During her eight years living with humans, Kamala learned to understand speech, mastered a vocabulary of 45 words, and could form two- and three-word sentences, in addition to a wide repertoire of gestural communication. These developments did not take place overnight. For several years Kamala was frightened of humans (uttering shrieks and cries when they came near) and was largely mute. Through therapeutic massage and dedicated attention from Mrs. Singh, Kamala grew to become an active, affectionate member of the Singhs' orphanage.

The case of Anna provides a contrast to Kamala. Born in 1932, Anna was the second illegitimate child of a rural woman. Unwanted by her parents (they tried unsuccessfully to place her for adoption) and hated by her grandfather, Anna was kept locked from sight in the attic of the family home. For 6 years she lived with minimal human contact and subsisted on a diet of cow's milk. She could neither walk nor talk and was malnourished. After a few years of institutional care Anna was able to speak in phrases and short sentences, although her abilities remained in the retarded range. She died August 6, 1942, at the age of ten.

The challenge to learning is clear in these examples. In many cases feral children are quadrupedal, and in most cases they lack speech. Hence, the challenges of establishing basic behaviors (e.g., walking upright, not eating from the floor) compound the normal challenges of learning (e.g., speech training, interpersonal skills training). In some cases, previous responses must be replaced by new ones, such as Kamala's learning to eat from a table, whereas in other cases existing cognitive processes need to be modified, such as Kaspar Hauser's rudimentary speech. In all cases feral children provide food for thought about a variety of issues related to learning, development, and cognition.

Candland, D. K. (1993). Feral children and clever animals: Reflections on human nature. Oxford: Oxford University Press.

Davis, K. (1940). Extreme social isolation of a child. American Journal of Sociology, 45, 554-565.

Gesell, A. (1940). Wolf child and human child: The life history of Kamala, the wolf girl. New York: Harper.

Linnaeus, C. (1758). Systema Naturae (10th ed.).

McNeil, M. C., Polloway, E. A., & Smith, J. D. (1984, February). Feral and isolated children: Historical review and analysis. Education and Training of the Mentally Retarded, pp. 70-79.

Singh, J. A. L., & Zingg, R. M. (1941). Wolf-children and feral man. New York: Harper & Bros.

Zingg, R. M. (1940). Feral man and extreme cases of isolation. American Journal of Psychology, 53, 487-517.

Pigeon Overhead: Bombs Away!

Animals have consistently played a prominent role in learning and conditioning experiments, from Edward Thorndike's cats to Edward Tolman's rats to the disobedient menagerie of Marian and Keller Breland. Included in this list are some very famous pigeons who almost helped the national defense.

B. F. Skinner worked at the University of Minnesota during the Second World War. Interested in applying the principles of operant conditioning to the war effort, Skinner trained pigeons to peck at discs on which moving pictures of enemy targets were displayed. The pecking served to close electronic circuits, which in turn formed a self-regulating guidance system. Although this is no great feat in itself - the pigeons' actions faithfully follow the most basic rules of operant conditioning - Skinner's vision was to install his pigeons, discs, and circuits in gliders packed with explosives. The idea was to have the pigeons peck on cue to operate the circuits, which in turn would keep the glider on its kamikaze course toward an enemy target. A neat, tidy bombing run, with no loss of human life.

The Defense Department declined Skinner's help, even though he demonstrated to top scientists that the homing device withstood electronic jamming, the apparatus was inexpensive to build, and the basic set-up could be applied to a range of enemy targets. In the present era of Star Wars weaponry, stealth bombers, and combat guided by virtual reality, perhaps a pigeon bombardier wouldn't seem so far-fetched.

Hergenhahn, B. R., & Olson, M. H. (1993). An introduction to theories of learning (4th ed.). Englewood Cliffs, NJ: Prentice Hall.

Skinner, B. F. (1960). Pigeons in a pelican. American Psychologist, 15, 28-37.

Punishment

Students often have difficulty distinguishing between negative reinforcement and punishment. These examples of types of punishment may clarify what it is and when it should be used.

Physical punishment or aversive punishment involves administering a stimulus that evokes discomfort. Spankings, electric shock, harsh sounds, or pinches would be included in this category. Aversive punishment is typically used in extreme cases, as it is pleasant neither to administer nor to receive. Reprimands are strong verbal commands ("No!" "Stop that!" "Bad!") used when an inappropriate behavior is displayed. They are sometimes accompanied by physical or nonverbal reprimands. Timeout can be exclusionary or nonexclusionary. Exclusionary timeout involves removing an individual for a short time from a situation that he or she finds reinforcing. Nonexclusionary timeout involves introducing a stimulus associated with less reinforcement. For example, children might be given a "good conduct" badge to wear while playing in a classroom. If a child becomes disruptive, the badge is removed, and the child is ignored by the teacher and not allowed to play with the others. Finally, response cost involves removing a specified amount of reinforcement after an undesired behavior occurs. Parking tickets, bank fees, or library fines would be examples of this type of punishment.

As the text mentions, to be effective, punishment must be swift, certain, and sufficient. Some guidelines for deciding to use punishment include selecting a specific response to punish (such as spitting out food) rather than a general category of behavior (such as not eating or being finicky); maximizing the conditions for a desirable alternative response while minimizing the causes of the undesirable response; and selecting an effective punisher (i.e., one that can be delivered immediately and will not be associated with subsequent positive reinforcement).

Martin, G., & Pear, J. (1992). Behavior modification: What it is and how to do it (4th ed.). Englewood Cliffs, NJ: Prentice Hall.

The Cat's Out of the Bag!...er, Box!

Edwin Guthrie is chiefly known for one idea in Behaviorism: the principle of one-trial learning. Guthrie held that learning was complete - that is, an association between a stimulus and a response was at its strongest - after only one pairing of the stimulus and response.

The way he set about testing his idea was to use a variant of Thorndike's puzzle box. Guthrie modified the box by placing a long, thin rod vertically in it, wired so that each time a cat rubbed against it the door to the box would spring open, allowing the animal to exit. Guthrie noted that, across some 800 escape trials, each cat had a stereotyped way of rubbing the rod, which was repeated trial after trial, even in the absence of reinforcement. He took this as evidence for one-trial learning; the response was full-blown from the first trial, and it was not modified over trials.

Being a good Behaviorist, Guthrie made careful observations of the laboratory animals. Being a good Behaviorist, Guthrie stuck to fairly straightforward, objective testing conditions. But being a good Behaviorist, Guthrie assumed that species-specific behavior would not play a major role in the experiment's outcomes. Like Clark Hull, for example, Guthrie was interested in demonstrating a principle of learning, regardless of whether it was demonstrated by a cat, rat, chimpanzee, or human. Unfortunately, cats exhibit a stereotyped greeting response when in the presence of a conspecific (which, for most domestic cats, includes humans). That is, they rub against their fellow cat as it passes by or, in the case of greater distances, they rub against a more convenient object, such as a tree, furniture, or Uncle Harry's leg. As Guthrie and his laboratory assistants observed the cats, then, it is not remarkable that they all showed highly stereotyped behavior; they did what cats do.

Bruce Moore and Susan Stuttard illustrated this point in a simple experiment. Cats were placed in puzzle boxes that had long, thin, vertical rods, but this time rubbing the rods did not open any door. Moore and Stuttard also varied whether a person was present as the cats meandered through the box. They discovered, quite simply, that when a person was present the rod was rubbed, and when a person was not present, the rod was not rubbed. As Guthrie observed, the rubbing itself was quite stereotyped, befitting an innate feline response.

Guthrie, E. R., & Horton, G. P. (1946). Cats in a puzzle box. New York: Rinehart.

Leahey, T. H., & Harris, R. J. (1993). Learning and cognition (3rd. ed.). Englewood Cliffs, NJ: Prentice Hall.

Moore, B. R., & Stuttard, S. (1979). Dr. Guthrie and Felis domesticus or: Tripping over the cat. Science, 205, 1031-1033.

Applied Learning

Behavior modification can be thought of as a technology that developed out of learning theory. Based on the principles of operant conditioning, behavior modification seeks to structure the reinforcement a person receives for his or her actions in order to modify or shape more productive behavior. There are several areas of application, as noted by Garry Martin and Joseph Pear, including child-rearing, education, developmental disabilities, clinical and medical settings, self-management, sport psychology, and business and industry.

Martin, G., & Pear, J. (1992). Behavior modification: What it is and how to do it (4th ed.). Englewood Cliffs, NJ: Prentice Hall.


student assignments

Classical Conditioning and the Pupil Dilation Response

Roger Hock suggests a simple classical conditioning experiment that students can perform on themselves at home. Students will need a bell, a hand-held mirror, and a room that becomes completely dark when the light is turned off. Instruct students to hold the bell while standing in the room near the light switch. Once in position, they should ring the bell and then immediately turn off the light. After waiting in total darkness for about 15 seconds, they should turn the light back on. They should wait another 15 seconds with the light on, and then ring the bell and immediately turn the light back off (again waiting 15 seconds in the dark). Students should repeat this procedure 20 to 30 times, making sure that in each case the bell is rung immediately before the light is turned off. After numerous pairings, students are ready to see the results. With the light on, they should watch their eyes closely in the mirror and then ring the bell. Students' pupils should dilate slightly even without a change in light!

For a simple out-of-class assignment, ask students to perform this demonstration at home and to report on their results as part of a class discussion. For a more elaborate assignment, ask students to write a 1- to 2-page paper explaining the process in terms of classical conditioning. Students should explain that because pupils naturally dilate and constrict according to light intensity, the darkness in this study is an unconditioned stimulus (US) that leads to the unconditioned response (UR) of pupil dilation. After being repeatedly paired with the unconditioned stimulus (i.e., darkness), the neutral stimulus (the bell) becomes a conditioned stimulus (CS) that elicits the conditioned response (CR) of pupil dilation. As part of their paper, students should also propose another (i.e., original) classical conditioning experiment that they can perform at home. Have students perform their experiment and then carefully describe the procedure and results in classical conditioning terms. An added benefit of this assignment is that many of the clever ideas generated by students can be used as out-of-class demonstrations for future classes!

Hock, R. R. (1992). Forty studies that changed psychology: Explorations into the history of psychological research. Englewood Cliffs, NJ: Prentice Hall.

Conditioning in Everyday Life

After having read the textbook, attended lecture, and completed some of the practice exercises suggested here, students should be well-versed in the theories and principles behind classical and operant conditioning. Students should now be ready to apply what they've learned to everyday life. Ask students to write a 2- to 3-page paper discussing practical extensions of classical conditioning, operant conditioning, or both. There are several variations to this assignment.

In one version, students can discuss a variety of examples of conditioning from their own personal experience. A student might, for instance, retell the story of a recent taste aversion experience, note that a pet cat is conditioned to respond to the sound of a can opener, or describe how they reinforce their own study habits.

In another version, students can be asked to find examples of how businesses cleverly use these principles in an attempt to influence consumers. Potential examples include magazine or newspaper ads that associate a product with a stimulus that produces positive feelings, letters from polling organizations that include an incentive (such as a crisp dollar bill) for completing a questionnaire, "gifts" (such as personalized address labels) from non-profit charitable organizations seeking donations, or any of a new crop of "personalized" appeals that get our attention because they use cues associated with friendship rather than outright sales pitches (e.g., a handwritten message on a yellow sticky note attached to what appears to be a vacation postcard from a friend).

In still another version of this assignment, students can be asked to locate two or three reports of recent research that apply principles of classical or operant conditioning to real-world problems (e.g., classical conditioning as a treatment for bedwetting, incentives for participating in curbside recycling programs, the revoking by states of teenage drivers' licenses for school truancy).

For any and all versions of this assignment (combinations are possible as well), students' papers should contain a thorough discussion that relates their examples to the theory and principles discussed in text and lecture.

Behavior Modification Project

An excellent way for students to gain a greater appreciation for learning concepts is to put what they've learned to practical use. For this assignment, ask students to apply principles of operant conditioning to modify an existing behavior. Instruct students to identify a target behavior to be modified, either an undesirable behavior that they would like to eliminate or a desirable behavior that they would like to strengthen. By taking a close look at many aspects of their own behavior--such as study habits, sporting skills, health habits, or personal-interaction skills--students should have no trouble selecting a behavior they'd like to change. Examples of potential undesirable behaviors to eliminate include smoking cigarettes, eating fatty foods, watching too much TV, speeding, phobias or anxieties (e.g., fear of flying, test anxiety), and procrastination before exams or papers. Examples of desirable behaviors to be increased include remembering people's names, becoming more punctual with respect to class or social events, outlining textbook chapters while reading, using a turn signal while driving, and increasing a skill in sports (e.g., using a left foot in soccer, increasing free-throw percentage in basketball).

For this assignment, students should propose a program for changing a behavior and then later, after implementing their program, report on its results. This probably works best as a two-part assignment (to ensure that students' programs are carefully thought out and so they don't fudge on their criteria for, or evaluation of, success), but should also work fine as a single final report. After identifying the target behavior, students should monitor their behavior for a few days and try to generate a plausible explanation for why the problem exists. They should also describe why they want to change the behavior and what benefits change will bring. Next, students should carefully design a program for modifying the behavior. In their program proposal, students should describe all relevant conditioning principles incorporated within their plan, which might include the use of positive and negative reinforcers, punishment, shaping, schedules of reinforcement, modeling, extinction, stimulus discrimination or generalization, primary and secondary reinforcers, and so on. Students should then implement their program and write up an honest report of the results. To what degree was the program successful? Plausible explanations for success or failure should be highlighted. If students failed, they should propose (but not carry out) an alternative plan that might be more successful in the future. If students succeeded, they should propose a plan to help them maintain the change. In fairness and to encourage students to select important (but perhaps difficult) target behaviors, students should be graded on their understanding and application of learning principles (e.g., as evident in their program design and implementation) rather than their degree of success at modifying the chosen behavior.

Behaviorism in Literature: Walden Two

In Walden Two, B. F. Skinner describes a utopian community in which major social problems have been eliminated through the use of operant conditioning. This ideal community is free of racism, crime, poverty, and laziness, all owing to its strict adherence to reinforcement principles as a means of governing behavior. Ask your students to read Walden Two and then to write an essay applying the behavioral principles from the text and lecture to the novel. Their discussion should also include a critical evaluation of the community presented in the novel. Would such an approach be feasible? What are the advantages and disadvantages of this system? Would they personally like to live there? Why or why not?

Alternatively, Michael Gorman and his colleagues have suggested contrasting Walden Two, which emphasizes the environment as the sole determinant of behavior, with Mark Vonnegut's The Eden Express, which explores the biological roots of behavior (in this case, schizophrenia).

Gorman, M. E., Law, A., & Lindegren, T. (1981). Making students take a stand: Active learning in introductory psychology. Teaching of Psychology, 8, 164-166.

Skinner, B. F. (1948). Walden Two. New York: Macmillan.


demonstrations and activities

Defining Learning

Rather than delving immediately into the principles of classical and operant conditioning, consider introducing the topic of learning by devoting class time to a discussion of its definition. Although the psychological definition of learning is a fairly straightforward one, students may initially have trouble with the concept because of their intuitive notion that "learning" is synonymous with "studying" and also because they have difficulty distinguishing behaviors that are truly learned from those that can be attributed to other factors such as instinct or maturation.

Thomas Rocklin suggests an engaging activity that can be used to help your students explore the concept of learning. Handout 6-1 contains a list of events compiled by Rocklin that potentially represent examples of learning. Duplicate and distribute this list (or read it aloud) and ask students to indicate which events are examples of learning and which are not. As students defend their choices, their own intuitive definitions of learning should become evident. During the discussion, be sure to compare and contrast their ideas about learning with the definition presented in the text (i.e., that learning is "the process by which experience or practice results in a relatively permanent change in behavior or potential behavior"). Rocklin notes that although the majority of events yield fairly consistent answers, items related to computers typically generate disagreement and controversy. In addition to enjoying the active participation encouraged by this exercise, students should also come away with a more thorough understanding of the concept of learning.

Rocklin, T. (1987). Defining learning: Two classroom activities. Teaching of Psychology, 14, 228-229.

Classical Salivary Conditioning

Dennis and Rosemary Cogan have designed a relatively quick but powerful demonstration of classical conditioning. You'll need a can of sweetened lemonade powder and enough small Dixie-type cups so that each participating student has one. After discussing Pavlov's work, distribute to each student a cup approximately half-filled with lemonade powder. After deciding on a neutral stimulus to serve as the conditioned stimulus (the Cogans suggest "Pavlov"), you are ready to begin conditioning. Instruct students to moisten the tip of their index finger, dip it into the powder, and then touch it to their tongues whenever you give a prearranged signal (such as raising your arm). Also inform students that you will occasionally say the words "test trial" instead of giving the signal; when this occurs, students should refrain from tasting the powder and instead close their eyes and concentrate on their own experience.

Present the CS (i.e., say "Pavlov") and then, after a delay of 0.5 to 1.5 seconds, give the signal for students to taste the lemonade powder (i.e., raise your arm). These learning trials should be repeated every 10 to 15 seconds, with test trials (in which you say, "Pavlov...test trial") occurring after every 10 learning trials. After each test trial, ask for a show of hands for those who are salivating (the majority of the students should be salivating by the 7th or 8th trial). When most students show evidence of conditioning, demonstrate extinction by continuously giving test trials (i.e., saying "Pavlov...test trial" over and over) until students no longer salivate. During the next class session, demonstrate spontaneous recovery by saying the word "Pavlov" and asking for a show of hands for those who salivate. The Cogans report that in addition to being enthusiastically received by students, this demonstration facilitates understanding of conditioning principles and generates a discussion of classical conditioning applications to real-life problems.

Cogan, D., & Cogan, R. (1984). Classical salivary conditioning: An easy demonstration. Teaching of Psychology, 11, 170-171.
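
Instructors who want to accompany this demonstration with a predicted learning curve can generate one with a short computer simulation. The Python sketch below is only a rough, illustrative model; it applies the Rescorla-Wagner updating rule with arbitrary parameter values (the learning rate, asymptotes, and trial counts are assumptions, not figures from the Cogans' study) to show the negatively accelerated rise in associative strength during acquisition and its decline during extinction.

    # A minimal Rescorla-Wagner sketch of acquisition and extinction.
    # All parameter values are arbitrary and chosen only for illustration.
    LEARNING_RATE = 0.2   # combined salience of the CS and US (assumed)
    ACQ_TRIALS = 30       # trials on which "Pavlov" is paired with the lemonade powder
    EXT_TRIALS = 20       # test trials on which "Pavlov" is presented alone
    strength = 0.0        # associative strength (V) of the CS, starting at zero
    history = []
    # Acquisition: the US is present, so V climbs toward an asymptote of 1.0.
    for _ in range(ACQ_TRIALS):
        strength += LEARNING_RATE * (1.0 - strength)
        history.append(strength)
    # Extinction: the US is omitted, so V falls back toward 0.0.
    for _ in range(EXT_TRIALS):
        strength += LEARNING_RATE * (0.0 - strength)
        history.append(strength)
    for trial, v in enumerate(history, start=1):
        phase = "acquisition" if trial <= ACQ_TRIALS else "extinction"
        print(f"Trial {trial:2d} ({phase}): V = {v:.3f}")

Plotting the values in history (for example, with matplotlib) yields curves like those summarized in Transparency 5.6; note that this simple rule does not itself model spontaneous recovery.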

Understanding the Elements of Classical Conditioning

Although students have no problem understanding classical conditioning intuitively, they often become confused by the terminology and have difficulty keeping straight the four elements of classical conditioning: unconditioned stimulus (US), unconditioned response (UR), conditioned stimulus (CS), and conditioned response (CR). After lecturing on classical conditioning, consider giving your students extra practice applying these principles by going over Handout 6-2 in class. Correct answers are given below.

Scenario 1: US = poison; UR = dizziness & nausea; CS = sheep; CR = running away (aversion)
Scenario 2: US = immune-suppressing drug; UR = weakened immune response; CS = saccharin-flavored water; CR = weakened immune response
Scenario 3: US = job interview; UR = anxiety/nervousness; CS = airplane/flying; CR = anxiety/fear
Scenario 4: US = bad weather; UR = unhappiness; CS = weathercaster; CR = unhappiness
Scenario 5: US = attractive women; UR = desire; CS = automobile; CR = desire

Ader, R., & Cohen, N. (1985). CNS-immune system interactions: Conditioning phenomena. Behavioral and Brain Sciences, 8, 379-94. [Example 2]

Cialdini, R. B. (1993). Influence: Science and practice (3rd ed.). New York: HarperCollins. [Examples 4 & 5]

Gustavson, C. R., Garcia, J., Hankins, W. G., & Rusiniak, K. W. (1974). Coyote predation control by aversive conditioning. Science, 184, 581-583. [Example 1]

Shark Attack!

Randy Smith suggests an engaging demonstration that provides a nice, real-world introduction to the concepts in classical conditioning. Before your lecture on classical conditioning, obtain a tape recording of several seconds of the shark attack music from the movie Jaws. Then, begin your lecture by asking students to relax, close their eyes, and imagine the following scene. Tell them to imagine that they are at the beach on a clear, sunny, scorching day. They are getting hotter by the minute, and when they can stand it no longer, they run toward the cool, refreshing ocean, and splash around in the shallow water. Then, they swim out into deeper water and are really enjoying cooling off after being in the sun. (Smith suggests really playing this up and stretching it out for 2 or 3 minutes.) Then, as unobtrusively as possible, start the music from Jaws. As students' expressions change dramatically and laughter erupts, it becomes clear that most have seen Jaws and have been conditioned to associate the killer shark with the famous music. Turn the music off and ask students to explain in their own words why they responded as they did. They should be able to explain that the ferocious shark (US) in the movie naturally elicited a strong fear response (UR), and that because the music always signaled an impending shark attack, the music (CS) by itself came to elicit a fear response (CR). Smith reports that students should be quite prepared to understand your discussion of Pavlov's experiment and classical conditioning and to see the application of classical conditioning to human behavior.

Smith, R. A. (1987). Jaws: Demonstrating classical conditioning. In V. P. Makosky, L. G. Whittemore, & A. M. Rogers (Eds.), Activities Handbook for the teaching of psychology: Vol. 2 (pp. 65-66). Washington, DC: American Psychological Association.

Crunch! A Quick Demonstration of Classical Conditioning

Vandendorpe described a quick demonstration designed to illustrate a conditioned emotional response to the word "crunch." Start the lecture by telling students that the experimental psychology class is studying cognitive associations to various words, and has asked you to obtain some data by getting associations to the word "crunch." Then ask your class to provide some associations that occur to them when they hear the word "crunch" and record them. Continue with your lecture and when you begin discussing phobias, use the narrative developed by Vandendorpe as an example:

Of course, I don't actually have a phobia, but I do have a rather intense dislike that will show you what I mean by a conditioned emotional response. Now, I really don't mind most insects, like ants and tomato bugs. I even think spiders are fine, although most people don't care for them. But what I really can't stand...what really gets my skin crawling are certain kinds of bugs. You know the ones I mean...they creep around at night, when you can't see them, and they like dark places, and hide in sewers. I'm talking about those black water bugs, and cockroaches, too. I really dislike them. Now I understand that they don't carry disease, so that if a cockroach walked over your dinner plate, it wouldn't be that bad a thing, but I still don't like them. And the thing I really hate about them is that they've got these hard shells, so that if you step on them, they go "crunch" (when you say crunch, do so with some dramatic flair and emphasis).

Vandendorpe noted that this demonstration almost never fails to elicit groaning or physical discomfort to the word "crunch." After producing the conditioned emotional response, go back over the original associations produced to the word and discuss why the word did not initially produce the conditioned emotional response and how the response might have been established.

Vandendorpe, M. M. (1988, October). Crunch: Demonstrating the conditioned emotional response. Paper presented at the Mid-American Conference for Teachers of Psychology, Evansville, IN.

Reprinted from Hill, W. G. (1995). Instructor's resource manual for Psychology by S. F. Davis and J. J. Palladino. Englewood Cliffs, NJ: Prentice Hall.

Operant Conditioning in Human Behavior

Because much of the operant conditioning research presented in textbooks is conducted with animals (e.g., rats, pigeons, dogs), students sometimes have difficulty seeing its relevance to human behavior, which presumably is not as susceptible to environmental control. Edward Stork suggests a simple demonstration that can be used to generate a discussion of human operant conditioning. While discussing operant conditioning, interrupt your lecture with a question that you know will elicit a mostly positive response (e.g., "How many of you are planning to major in psychology?"; "How many of you live within 5 miles of campus?"; "How many of you plan to register for classes next term?"). Most students will raise their hands in response to your question. Tell them to hold that position, and ask if anyone told them to raise their hands or even mentioned raising hands. After the chorus of groans (from "being caught") dies down, ask students to explain their behavior in terms of operant conditioning, and then use this activity as a springboard for generating other examples of operant conditioning in humans.

Stork, E. (1981). Operant conditioning: Role in human behavior. In L. T. Benjamin, Jr., & K. D. Lowman, (Eds.), Activities handbook for the teaching of psychology (p. 57). Washington, DC: American Psychological Association.

Using Candy to Illustrate Operant Conditioning Concepts

After having tricked your students into displaying evidence of operant conditioning in the previous exercise, reward them with this simple (and tasty!) demonstration that uses a candy machine to illustrate operant conditioning concepts. During your lecture on operant conditioning, place a filled candy machine (e.g., containing M&Ms or peanuts) on a table at the front of the room. (A bubble gum machine can be substituted, but because it requires pennies it can be a little more cumbersome to use). Invite any and all interested students to come inspect the machine and do whatever they want to with it (most will, of course, pull the lever and be rewarded with candy). While students are engaging in this activity, ask them to relate any behaviors they observe to material from the text. You can also prompt them with questions to help them understand additional terms and concepts, such as: (1) What would happen to your behavior if all the candy were gone? (extinction), (2) What if the machine were refilled? (spontaneous recovery), (3) What if there was an empty coffee jar next to the candy machine? (discrimination), (4) What if there was a similar machine (such as a gumball machine), filled, but not exactly like the candy machine? (generalization), (5) What if you do not like candy or cannot eat it for health reasons? (effectiveness of a reinforcer, motivation), (6) What if the machine were filled with money instead of candy? (secondary reinforcer), (7) What if, like a slot machine, money only appeared after a random number of pulls of the lever? (variable ratio schedule), (8) What might happen to your behavior if you are reinforced on a variable ratio schedule instead of a continuous one? (superstitious behavior, extinction would take longer), (9) What would happen if very bad tasting candy came out of the machine? (punishment). During the course of this exercise, it is likely that students will come up with additional questions or interesting variations of their own.

Adapted from Smith, J. Y. (1990). Demonstration of learning techniques. In V. P. Makosky, C. C. Sileo, L. G. Whittemore, C. P. Landry, & M. L. Skutley (Eds.), Activities handbook for the teaching of psychology: Vol. 3 (pp. 83-84). Washington, DC: American Psychological Association.

Conditioning a Student "Rat"

A fun demonstration of shaping the behavior of a student "rat" can be used to liven up your coverage of operant conditioning. (This exercise should be done after students are familiar with the concept of shaping by successive approximations.) Ask for a student volunteer to be the rat and send that person outside the classroom. In the meantime, the class should select a target behavior for the rat to perform. Potential target behaviors include turning off the classroom lights, turning on the overhead projector, picking up chalk or an eraser, scratching his or her head, shaking hands with the instructor, and so on. The class should also select its method of reinforcement; smiles (which can be big or small), nods (which can be slight or vigorous), or even pencil tappings should work well. When the "rat" returns, the class should reinforce successive approximations of the goal behavior. That is, they should reinforce the rat as it gets closer to performing the target behavior and withhold reinforcement when it moves away from it. Alternatively, students can adopt the popular "temperature" version of the childhood game and use "cold," "cool," "warm," "hot," and so on to reflect closeness to the goal.

This exercise can also be modified to include more student rats as part of a small-group activity. Depending on your class size, divide students into groups of about 4-6 people and ask for a volunteer "rat" from each group. To keep things flowing smoothly, you should distribute to each group a slip of paper containing the target behavior. These should be simple, personal behaviors (such as having rats cover their eyes, clap their hands, stand on one leg, make the "okay" gesture, say a target word, take off a watch, etc.) rather than more expansive behaviors (such as going to a corner of the room) in order to keep the activity coordinated and manageable. If you have time, you may want to let each student have a turn playing the rat.

Reinforcement vs. Punishment

Although reinforcement (which serves to increase or strengthen a behavioral response) is conceptually the opposite of punishment (which serves to decrease or weaken a behavioral response), students often have a hard time distinguishing negative reinforcement from punishment. Handout 6-3 contains several realistic examples of behavior that can be classified as positive reinforcement, negative reinforcement, or punishment. After you have discussed these principles in lecture, test your students' ability to apply what they've learned by going over this short exercise in class. Correct answers are given below.

1. PR   2. PUN   3. PUN   4. NR   5. PR
6. PR   7. NR   8. PUN   9. PUN   10. NR
11. NR   12. PR   13. PUN   14. NR   15. PR
16. PR   17. PUN   18. PR   19. PUN   20. PR

Schedules of Reinforcement

After you have lectured on reinforcement schedules, test students' ability to apply fixed-interval (FI), fixed-ratio (FR), variable-interval (VI), and variable-ratio (VR) schedules to everyday behavior. Handout 6-4 contains many real-world examples of these schedules and can be duplicated and distributed to students or given orally if your copying budget is tight. Correct answers follow.

1. FI   2. VI   3. VR   4. VI   5. FR
6. VR   7. FI   8. FR   9. VI   10. VR
11. FR   12. FR   13. FI   14. VR   15. VI
16. FI   17. VI   18. FI   19. VR   20. FR

Examples selected from a compilation by Roig, M., and Greco-Vigorito, C. (1993). Catalog of negative reinforcement and intermittent reinforcement schedule examples. Paper presented at the 101st Annual Convention of the American Psychological Association, Toronto.
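
For instructors who want to show the four schedules as explicit delivery rules, the Python sketch below is one possible illustration. The function names, the ratio and interval sizes, and the assumption that the subject responds exactly once per second are ours, not part of Handout 6-4 or the Roig and Greco-Vigorito compilation.

    import random

    def fixed_ratio(n):
        # Reinforce every nth response.
        def rule(response_count, elapsed_seconds):
            return response_count % n == 0
        return rule

    def variable_ratio(mean_n):
        # Reinforce each response with probability 1/mean_n, so the number of
        # responses required varies unpredictably around a mean of mean_n.
        def rule(response_count, elapsed_seconds):
            return random.random() < 1 / mean_n
        return rule

    def fixed_interval(seconds):
        # Reinforce the first response made after a fixed interval has elapsed,
        # then start timing the next interval.
        next_time = seconds
        def rule(response_count, elapsed_seconds):
            nonlocal next_time
            if elapsed_seconds >= next_time:
                next_time = elapsed_seconds + seconds
                return True
            return False
        return rule

    def variable_interval(mean_seconds):
        # Like a fixed-interval schedule, but the interval length varies around a mean.
        next_time = random.randint(1, 2 * mean_seconds)
        def rule(response_count, elapsed_seconds):
            nonlocal next_time
            if elapsed_seconds >= next_time:
                next_time = elapsed_seconds + random.randint(1, 2 * mean_seconds)
                return True
            return False
        return rule

    def simulate(name, rule, responses=60):
        # Assume the subject responds exactly once per second and count reinforcers.
        reinforcers = sum(rule(t, t) for t in range(1, responses + 1))
        print(f"{name}: {reinforcers} reinforcers for {responses} responses")

    for name, rule in [("FR 5", fixed_ratio(5)), ("VR 5", variable_ratio(5)),
                       ("FI 10 s", fixed_interval(10)), ("VI 10 s", variable_interval(10))]:
        simulate(name, rule)

Students can be invited to change the simulate function so that the subject responds at different rates and then see which strategy pays off under ratio schedules (where responding faster earns more reinforcers) versus interval schedules (where it largely does not).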

Human Cognitive Maps

The text describes Tolman's famous study, which demonstrated that latent learning (i.e., learning not immediately reflected in behavior change) can occur in rats. His study showed that rats left alone in a maze, even when not reinforced for their behavior, formed "cognitive maps" (mental representations of the maze) that allowed them to run the maze swiftly and accurately when later reinforced with food. Humans, too, use a variety of cognitive maps to represent important environments, such as a school, a stadium, an airport, a freeway system, a parking lot, a shopping mall, and so on. You can illustrate this phenomenon easily with students by having them generate a cognitive map for a relevant environment, such as your college campus. After discussing Tolman's study, ask students to take out a blank sheet of paper and sketch a map of the campus. Compare and contrast the features included in (and excluded from) students' maps either by projecting an actual campus map on the overhead or by having a few students with detailed maps sketch their maps on the board. Discuss how and why students' maps differ. What landmarks were central to most maps? Were any features commonly left out? Were there differences between students who live on campus and those who commute? Were there differences related to major of study? Perhaps students' maps were more detailed and/or accurate for areas near their major department. Were there noticeable differences between maps drawn by seniors (who have spent considerably more time on campus) and first-year students? Were there any gender differences? Were there differences between athletes and nonathletes? These and other questions should spark an interesting discussion on the role of experience in developing cognitive maps.

Whitford, F. W. (1995). Instructor's resource manual for Psychology: An Introduction by C. G. Morris (8th ed.). Englewood Cliffs, NJ: Prentice Hall.


video

ABC News/Prentice Hall Video Library

Building Brains (12:41 min, Series III). The developmental changes that take place during the first three years of life are remarkable, particularly those changes that occur in the brain. This Nightline segment focuses on the importance of early stimulation, training, and education for "building" better brains.

Contract to Get Parents Involved in a Child's Education (3:47 min, Series III). ABC's World News Tonight looks at a program in Stone Mountain, Georgia, that encourages parents to actively become involved in their children's education. Parents of students at Pine Ridge Elementary sign contracts agreeing to attend parent-teacher conferences, spend a minimum of 15 minutes reinforcing each day's lessons, and volunteer at the school.

Other Sources

Apes and Language (1981, 29 min, IM). The efforts of Duane Rumbaugh, Roger Fouts, and others are examined in this overview of ape-human communication.

Ape Language: From Conditioned Response to Symbol (1986, 96 min, IM). Symbolic and syntactic skills of chimpanzees are considered.

Behavior Modification: Teaching Language to Psychotic Children (1969, 42 min, PH). O. Ivar Lovaas works with autistic children. A somewhat rare film, but worth viewing.

B. F. Skinner (Parts I and II, 1966, 50 min each, IM). In Part I of this set, Skinner discusses his views of Freud, teaching machines, motivation, and reinforcement. Part II focuses on Walden Two and behavior change in American society.

B. F. Skinner and Behavior Change (45 min, PENN). Traces the development of modern behaviorism and examines its applications in a variety of settings, as well as its ethical and social implications.

B. F. Skinner on Behaviorism (1977, 28 min, IM). Skinner discusses applications of behaviorism such as programmed instruction and behavior modification.

Dr. B. F. Skinner (1966, Parts 1 & 2, 50 min each, b&w, PENN). An intriguing interview with one of the major forces in psychology. In Part 1, Skinner talks at length about Freud and provides an enlightening comparison between the behaviorist and psychoanalytic schools. The main issues on the second tape are Skinner's novel "Walden Two" and his evaluation of the American educational system.

Biological Preconditions of Learning (28 min, FHS). How the senses, nervous system, and brain interact to produce learning. Changes in neural connections during learning are discussed.

Business, Behaviorism, and the Bottom Line (1972, 22 min, CRM). Skinner discusses Behaviorism in the workplace. Executives from Emery Air Freight discuss how behavior modification saved them $2 million in three years.

Classical and Instrumental Conditioning (1978, 20 min, HRM). The two approaches to conditioning are discussed and compared.

Classical and Operant Conditioning (56 min, FHS). This broad overview of Behaviorism presents terminology, applications, and debates surrounding these two types of learning.

A Conversation with B. F. Skinner (1972, 23 min, FLI). Skinner discusses Beyond Freedom and Dignity and some of the reactions it inspired.

A Demonstration of Behavioral Processes by B. F. Skinner (1975, 28 min, IM). Skinner makes a superstitious pigeon right before the camera. The principles of operant conditioning are reviewed.

Discovering Psychology, Part 8: Learning (1990, 30 min, ANN/CPB). Covers the basic principles of operant and classical conditioning through a focus on the contributions of Pavlov, Watson, Skinner, and others.

The First Signs of Washoe (1974, 58 min, NOVA). Washoe, Lana, and their chimpanzee friends have cameos in this exploration of attempts to teach communication skills. No speaking parts, though...

Further Approaches to Learning (57 min, FHS). Latent learning, learning sets, social learning, ethology, cognitive theories, and neuroscience get their just rewards in this exploration of "other" types of learning.

Genie (1991, 60 min, PBS). This powerful video recounts the real-life story of "Genie," a young girl living a feral existence in a disturbed home environment. UCLA psychologists and linguists recount the slow process of working with Genie to produce advances in learning and social adaptation. Actual footage of laboratory studies is used. Worth seeing; applicable in a variety of contexts.

Learning (23 min, FHS). Facets of the learning process--a newborn's adaptation to the environment; a young adult's apprenticeship training; use of educational software; the relationship between aging and learning--form the basis of this video.

Learning (1990, 30 min, IM). Provides an overview of classical and operant conditioning, including applications in treating hyperactivity and an interview with Skinner.

Learning and Behavior (1960, 26 min, IM). B. F. Skinner and R. J. Herrnstein present their views on how all learning is dependent upon rewarded responses. A consideration of measuring the learning process is also included.

Learning Disabilities (19 min, FHS). Diagnosing, evaluating, and treating learning disabilities are the focus of this video. The daily routine of a 9-year-old boy and his problems are featured.

Observational Learning (1978, 23 min, HRM). Clips from Bandura's Bobo doll experiments as well as examples of emotional conditioning, television violence, and other aspects of observational learning are presented.

One Step At A Time: Introduction to Behavior Modification (1973, 32 min, IM). A children's mental health facility is the setting for this examination of different behavior modification techniques. Positive reinforcement combined with early prevention are discussed as keys to minimizing future mental health problems.

Pavlov: The Conditioned Reflex (1975, 25 min, FHS). Using rare footage of Pavlov, this program focuses on his early research and its subsequent applications.

The Question of Learning (1974, 50 min, WGBH). Recreations of Pavlov's studies, "Clever Hans," and several other animal experiments illustrate the concepts of classical and operant conditioning.

Reward and Punishment (1974, 14 min, IM). The systematic use of reward and punishment in shaping the behavior of children is presented in a series of photographs. The advantages and disadvantages of reward versus punishment are considered.

Schools of Thought: Teaching Children in America and Japan (55 min, FHS). A comparison of Japanese and American educational systems forms the basis for answering questions such as why differences in education exist and how personal and cultural goals influence the education process.

Signs of the Apes, Songs of the Whales (1983, 57 min, WGBH). Washoe is revisited 10 years after her first starring role (see The First Signs of Washoe). An older, wiser ape, she shares the spotlight with dolphins, gorillas, and sea lions.

Skinner's Keynote Address: Lifetime Scientific Contribution Remarks (1990, 18 min, APA). In Skinner's last public appearance before his death, he emphasizes that psychology's true object of study is the analysis of behavior. In reaching this conclusion he addresses the various paths psychologists have followed, including introspection, natural selection, and variants of conditioning and learning.

Stimulus Response in Animals (1996, 33 min, FHS). A menagerie of animals is shown illustrating the basic principles of operant conditioning. Hens, calves, and pigs display autonomic and learned responses on demand.

Token Economy: Behaviorism Applied (1972, 23 min, IM). Skinner explains the basics of positive reinforcement and punishment and discusses applications using a token economy.

Washoe: Monkeys and Sign Language (1996, 52 min, FHS). This celebrated chimp is featured in this look at communication abilities among primates.

Why Can't I Learn? (1989, 30 min, COR/MTI). Sir Winston Churchill, Thomas Edison, and Cher are high achievers who each earned fame while overcoming a learning disability. Points out how many LD children go unidentified and untreated, while others in special programs graduate without knowing how to add, read, or write. An enlightening exploration into dyslexia and the current controversy over educating this special population.


transparencies series IV

5.1 Pavlov's Procedure for Classical Conditioning
5.2 Apparatus Used for Classical Conditioning of Salivation in Dogs
This apparatus, devised by G. F. Nicolai, one of Pavlov's students, allowed precise measurement of salivation. Pavlov did not use this apparatus in his earliest studies, but he later used a similar apparatus.
5.3 Classical Conditioning of an Eyeblink Reflex
This conditioned reflex has been studied in human and nonhuman subjects. The apparatus allows the delivery of a puff of air to the eye through a small tube. A thin, light band is taped to the eyelid, and the other end of the band is connected to a switch. Closing the eyelid also closes the switch, and this results in a record of the eye blink.
5.4 Types of Classical Conditioning
The temporal arrangements between the CS and US for delayed, trace, simultaneous, and backward conditioning are illustrated.
5.5 Interstimulus Interval and Strength of Conditioned Response
This graph presents the relationship between the strength of classical conditioning (measured as the percentage of CRs following CSs) and the length of the interstimulus interval (the delay between presentation of the CS and the US).
5.6 Acquisition, Extinction, and Spontaneous Recovery
This graph illustrates the typical pattern of changes in conditioned response strength in a series of trials that includes acquisition, extinction and spontaneous recovery.
5.7 Stimulus Generalization in Classical Conditioning
The experimental procedures used to demonstrate stimulus generalization in classical conditioning.
5.8 Generalization Gradient
Results showing the generalization of conditioned eyeblink reflexes in rabbits to tones which vary in pitch from the original CS.
5.9 Stimulus Discrimination Training in Classical Conditioning
The experimental procedures used to produce stimulus discrimination in classical conditioning.
5.10 Second-Order Conditioning
The experimental procedures used to train second-order conditioning in classical conditioning.
5.11 Contiguity or Contingency?
This transparency illustrates two CS-US relations that share the same contiguity but differ in the information the CS gives about the US. The procedure in the top half of the transparency results in much less conditioning.
5.12 Sensory Preconditioning
Sensory preconditioning illustrates that the association of two neutral stimuli, prior to presentation of any US, results in learning.
5.13 Blocking
The blocking phenomenon provides evidence for the view that conditioning occurs only when a stimulus provides information about the US. Contiguity is not enough.
5.14A Thorndike Puzzle Box
The graph shows that the cat learns to make the response required for escape more rapidly as the number of trials increases.
5.15 Operant Conditioning Chamber---A Skinner Box
The Skinner Box is designed to study operant conditioning.
5.16 Training Procedures Used in Operant Conditioning
This table presents the training procedures used in positive reinforcement, negative reinforcement, punishment, and omission training (response cost or time out from reinforcement).
5.17 Characteristics of Schedules of Reinforcement
The experimental procedures, typical patterns of responding, and effects on extinction are described for fixed ratio, fixed interval, variable ratio, and variable interval schedules.
5.18 The Partial Reinforcement Effect
A graph of the typical pattern of response rates during operant extinction for subjects trained with either continuous or intermittent (partial) reinforcement.
5.19 Reinforcement Schedules: Typical Response Patterns
Typical response patterns on a cumulative recorder for fixed ratio, fixed interval, variable ratio, and variable interval schedules. Each time a response is made, the pen steps up a small, fixed amount. The pen marks continuously on a moving strip of paper, so the earliest responses appear on the left side of the graph. Steeper lines indicate faster rates of responding, and the diagonal marks indicate the delivery of a reinforcer.
5.20 Evidence for Latent Learning
Rats in Group A never received reinforcement for running the maze and showed little change in the number of errors made between the start box and the goal. Rats in Group B received reinforcement on every trial and showed a steady decrease in the number of errors. Rats in Group C received no reinforcement for maze learning in the first 10 trials but did receive reinforcement thereafter; their performance improved immediately once reinforcement was introduced. These rats clearly learned something during the first 10 trials, but that learning remained latent until reinforcement was made available.
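
Instructors who want to go one step beyond transparencies 5.11 and 5.13 can demonstrate the standard error-correction account of these findings with a few lines of Python. The sketch below is not part of the transparency set: it is a minimal Rescorla-Wagner simulation, and the learning-rate and asymptote values (ALPHA, LAMBDA) and the numbers of trials are arbitrary assumptions chosen only for illustration.

# Minimal Rescorla-Wagner sketch (assumed parameter values) showing why a
# pretrained CS "blocks" conditioning to a CS added later: a stimulus gains
# strength only to the extent that the US is still surprising.

ALPHA = 0.3   # learning rate / CS salience (assumed value)
LAMBDA = 1.0  # maximum associative strength the US supports (assumed value)

def train(trials, V, present):
    """Run reinforced trials, updating strengths for the CSs in `present`."""
    for _ in range(trials):
        error = LAMBDA - sum(V[cs] for cs in present)   # how surprising the US is
        for cs in present:
            V[cs] += ALPHA * error                      # shared error term
    return V

# Blocking group: CS A alone is paired with the US, then A and B together.
blocking = train(20, {"A": 0.0, "B": 0.0}, ["A"])
blocking = train(20, blocking, ["A", "B"])

# Control group: A and B are only ever presented together with the US.
control = train(20, {"A": 0.0, "B": 0.0}, ["A", "B"])

print("Blocking group, V(B):", round(blocking["B"], 3))   # close to 0
print("Control group,  V(B):", round(control["B"], 3))    # close to 0.5

Because the two conditioned stimuli share a single prediction-error term, a pretrained CS A leaves almost no error for CS B to absorb, which is the blocking result; the same term captures the contingency point of transparency 5.11, since a CS that adds no information about the US gains little strength.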


transparencies series V

  1. A Thorndike "Puzzle Box"
    Fun with cats; show your students how to get out of a box.
  2. Apparatus Used to Study Classical Conditioning of Salivation in Dogs
    Fun with dogs; show your students how to spit in a tube.
  3. Pavlov's Procedure for Classical Conditioning
    Pavlov's basic design for studying US and CS is shown.
  4. Possible Sequences for Presenting the CS and US in Classical Conditioning
    Trace conditioning, delayed conditioning, simultaneous conditioning, and backward conditioning are illustrated.
  5. Stimulus Generalization in Classical Conditioning
    This transparency shows how stimulus generalization occurs in classical conditioning.
  6. Stimulus Discrimination Training in Classical Conditioning
    This transparency shows how stimulus discrimination occurs in classical conditioning.
  7. Operant Conditioning Chamber – A Skinner Box
    The apparatus used to study operant learning is shown.
  8. Training Procedures Used in Operant Conditioning
    The outcomes of positive and negative reinforcement, punishment, and omission are depicted.
  9. Characteristics of Schedules of Reinforcement
    Fixed and variable interval and ratio schedules are compared.
  10. Reinforcement Schedules: Typical Response Patterns
    The response patterns resulting from learning under fixed interval, fixed ratio, variable interval, and variable ratio schedules are shown. (A short simulation of these schedules appears after this list.)
  11. Evidence for Latent Learning
    Latent learning in rats is illustrated using the Tolman-Honzik study.
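
For items 9 and 10 above (and transparencies 5.17-5.19 in series IV), a short simulation can make concrete how often each schedule actually delivers reinforcement. The Python sketch below is not from the manual: it assumes a subject that responds exactly once per second for five minutes, and the schedule values (FR 10, VR 10, FI 30 s, VI 30 s) are arbitrary choices for demonstration.

# A simulation sketch of the four basic schedules. All values are assumptions:
# the subject responds once per second for 300 seconds, and the schedule
# parameters are FR 10, VR 10 (1-19 responses), FI 30 s, and VI 30 s (1-59 s).

import random

def run_schedule(schedule, seconds=300):
    """Count reinforcers earned when one response is made every second."""
    reinforcers = 0
    responses_since = 0                     # responses since the last reinforcer
    time_since = 0                          # seconds since the last reinforcer
    vr_target = random.randint(1, 19)       # unpredictable count, average 10
    vi_target = random.uniform(1, 59)       # unpredictable wait, average 30 s

    for _ in range(seconds):
        responses_since += 1
        time_since += 1
        if schedule == "FR10":
            earned = responses_since >= 10
        elif schedule == "VR10":
            earned = responses_since >= vr_target
        elif schedule == "FI30":
            earned = time_since >= 30
        else:                               # "VI30"
            earned = time_since >= vi_target
        if earned:
            reinforcers += 1
            responses_since = 0
            time_since = 0
            vr_target = random.randint(1, 19)
            vi_target = random.uniform(1, 59)
    return reinforcers

for name in ("FR10", "VR10", "FI30", "VI30"):
    print(name, run_schedule(name), "reinforcers in 5 minutes")

With the same steady response rate, the two ratio schedules deliver roughly three times as many reinforcers as the two interval schedules, which helps explain why ratio schedules typically sustain higher response rates.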

Handout 6-1

Defining Learning

Instructions. For each of the following events, determine whether or not it represents an example of learning.

  1. The cessation of thumb sucking by an infant.
  2. The acquisition of language in children.
  3. A computer program generates random opening moves for its first 100 chess games and tabulates the outcomes of those games. Starting with the 101st game, the computer uses those tabulations to influence its choice of opening moves. (A short illustrative sketch of such a program follows this handout.)
  4. A worm is placed in a T-maze. The left arm of the maze is brightly lit and dry; the right arm is dim and moist. On the first 10 trials, the worm turns right 7 times. On the next 10 trials, the worm turns right all 10 times.
  5. Ethel stays up late the night before the October GRE administration and consumes large quantities of licit and illicit pharmacological agents. Her combined (verbal plus quantitative) score is 410. The night before the December GRE administration, she goes to bed early after a wholesome dinner and a glass of milk. Her score increases to 1210. Is the change in scores due to learning? Is the change in pretest regimen due to learning?
  6. A previously psychotic patient is given Dr. K's patented phrenological surgery and no longer exhibits any psychotic behaviors.
  7. A lanky zinnia plant is pinched back and begins to grow denser foliage and flowers.
  8. MYCIN is a computer program that does a rather good job of diagnosing human infections by consulting a large database of rules it has been given. If we add another rule to the database, has MYCIN learned something?
  9. After pondering over a difficult puzzle for hours, Jane finally figures it out. From that point on, she can solve all similar puzzles in the time it takes her to read them.
  10. After 30 years of smoking two packs a day, Zeb throws away his cigarettes and never smokes again.

Reprinted with permission from T. Rocklin, 1987, Defining learning: Two classroom activities, Teaching of Psychology, 14, 228-229. Copyright 1987 by Lawrence Erlbaum Associates, Inc.
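
Item 3 above describes, in words, a simple outcome-tabulating learner. For instructors who want to make the procedure concrete when discussing answers, the following Python sketch is one hypothetical way such a program could work; the opening moves, the play_game() stub, and the weighting rule are invented for illustration and are not part of the original item.

# A hypothetical sketch of the program described in item 3 of the handout.
# Nothing here is prescribed by the item itself; it is one possible reading.

import random

OPENINGS = ["e4", "d4", "c4", "Nf3"]            # hypothetical opening moves
wins = {move: 1 for move in OPENINGS}            # small prior avoids divide-by-zero
plays = {move: 1 for move in OPENINGS}

def play_game(opening):
    """Stand-in for an actual chess game; the outcome here is just random."""
    return random.random() < 0.5                 # True means the program won

def choose_opening(game_number):
    """Random openings for the first 100 games, tabulation-weighted afterwards."""
    if game_number <= 100:
        return random.choice(OPENINGS)
    weights = [wins[m] / plays[m] for m in OPENINGS]
    return random.choices(OPENINGS, weights=weights)[0]

for game in range(1, 201):
    move = choose_opening(game)
    won = play_game(move)
    if game <= 100:                              # tabulate only the first 100 games
        plays[move] += 1
        wins[move] += int(won)

The question for class discussion remains the same as in the handout: whether this change in the program's behavior as a result of its earlier experience should count as learning.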

Handout 6-2

Elements of Classical Conditioning

Instructions. For each scenario presented below, identify the four major elements of classical conditioning. Specify for each example (a) the unconditioned stimulus (US), (b) the unconditioned response (UR), (c) the conditioned stimulus (CS), and (d) the conditioned response (CR).

  1. To discourage coyotes from attacking their sheep, ranchers feed the coyotes small pieces of mutton tainted with a poison that, when ingested, causes the coyotes to experience extreme dizziness and nausea. Later, when the coyotes are placed in the pen with the sheep, the mere smell of the sheep causes the coyotes to run frantically away from their former prey.
  2. As part of a new and intriguing line of research in behavioral medicine, researchers give mice saccharin-flavored water (a sweet substance that mice love) and then follow it with an injection of a drug that weakens the mice's immune systems. Later, when these mice drink saccharin-flavored water by itself, they show signs of a weakened immune response. Research is currently underway to see whether the reverse is possible (i.e., whether conditioning can be used to increase immune functioning), a discovery that would surely have important implications for new medical treatments.
  3. A passenger on an airplane was feeling very anxious about an important job interview the next morning, and as a result he was uneasy and nervous throughout the flight. Back at home weeks later, he is contemplating a holiday trip. Though he hadn't previously been afraid to fly, he finds himself suddenly nervous about flying and decides to cancel his plans to visit an out-of-state relative.
  4. It's no secret that people become unhappy when bad weather strikes, but what is surprising is that weather forecasters are consistently blamed for weather over which they obviously have no control. Weather forecasters around the country have been whacked by old ladies with umbrellas, pelted with snowballs, and even threatened with death (e.g., "You're the one that sent that tornado and tore up my house...I'm going to take your head off!" or "If it snows over Christmas, you won't live to see New Year's!") by people who mistakenly infer a causal relationship between the forecaster and the subsequent foul weather.
  5. Why is it that automobile advertisements--especially those for sports cars--often feature beautiful young women? Because smart advertisers know (and research confirms) that new cars shown in ads that include an attractive female are rated by men as faster, more appealing, better designed, and more desirable than the same cars shown in ads that do not include an attractive female.

Handout 6-3

Reinforcement vs. Punishment

Instructions. For each example presented below, identify whether positive reinforcement (PR), negative reinforcement (NR), or punishment (PUN) is illustrated by placing the appropriate abbreviation in the blank next to the item.

_____ 1. Police pulling drivers over and giving prizes for buckling up

_____ 2. Suspending a basketball player for committing a flagrant foul

_____ 3. A soccer player rolls her eyes at a teammate who delivered a bad pass

_____ 4. A child snaps her fingers until her teacher calls on her

_____ 5. A hospital patient is allowed extra visiting time after eating a complete meal

_____ 6. Receiving a city utility discount for participating in a recycling program

_____ 7. Grounding a teenager until his or her homework is finished

_____ 8. Scolding a child for playing in the street

_____ 9. A prisoner loses TV privileges for one week for a rule violation

_____ 10. A parent nagging a child to clean up her room

_____ 11. A rat presses a lever to terminate a shock or a loud tone

_____ 12. A professor gives extra credit to students with perfect attendance

_____ 13. A dog is banished to his doghouse after soiling the living room carpet

_____ 14. A defendant is harassed and tortured until he confesses

_____ 15. A young child receives $5 for earning good grades in school

_____ 16. A mother smiles when her child utters "Mama"

_____ 17. A child is put into "time out" for misbehaving

_____ 18. Employee of the month gets a reserved parking space

_____ 19. At a party, a husband becomes sullen when his wife flirts with a colleague

_____ 20. A woman watching a football game offers her child candy to play quietly

Handout 6-4

Schedules of Reinforcement

Instructions. Identify the reinforcement schedule illustrated in the following examples by placing the appropriate abbreviation in the blank next to the item. Use the following code:

Fixed Ratio (FR)

Variable Ratio (VR)

Fixed Interval (FI)

Variable Interval (VI)

_____ 1. Getting a paycheck every other week

_____ 2. Pop quizzes

_____ 3. Slot machines at gambling casinos

_____ 4. Calling the mechanic to find out if your car is fixed yet

_____ 5. A factory worker who is paid on piece work

_____ 6. Fly fishing: casting and reeling back several times before catching a fish

_____ 7. Looking at your watch during a lecture until the end of the lecture

_____ 8. A salesperson who gets paid on commission

_____ 9. Calling a friend and getting a busy signal because he or she is frequently on the phone

_____ 10. Signaling with your thumb while hitchhiking

_____ 11. Frequent flyer program: rewards after flying X number of miles

_____ 12. Collecting bottles, cans, or other recyclables for cash

_____ 13. An athlete's contract specifies salary increases to be renegotiated every three years

_____ 14. Buying lottery tickets

_____ 15. A person refrains from drugs for fear of random drug testing

_____ 16. Checking the refrigerator to see if the JELL-O is ready

_____ 17. Watching for shooting stars

_____ 18. Checking the mail, assuming the mail carrier comes at the same time every day

_____ 19. Playing Bingo


© 1999-2000 by Prentice-Hall, Inc.
A Pearson Company