B. F. SKINNER'S THEORY:
B. F. Skinner’s entire system is based on operant conditioning. The organism is in the process of “operating” on the environment, which in ordinary terms means it is bouncing around its world, doing what it does. During this “operating,” the organism encounters a special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing the operant -- that is, the behavior occurring just before the reinforcer. This is operant conditioning: “the behavior is followed by a consequence, and the nature of the consequence modifies the organism’s tendency to repeat the behavior in the future.”
Imagine a rat in a cage. This is a special cage (called, in fact, a “Skinner box”) that has a bar or pedal on one wall that, when pressed, causes a little mechanism to release a food pellet into the cage. The rat is bouncing around the cage, doing whatever it is rats do, when it accidentally presses the bar and -- hey, presto! -- a food pellet falls into the cage! The operant is the behavior just prior to the reinforcer, which is the food pellet, of course. In no time at all, the rat is furiously pedaling away at the bar, hoarding its pile of pellets in the corner of the cage.
A behavior followed by a reinforcing stimulus results in an increased probability of that behavior occurring in the future.
What if you don’t give the rat any more pellets? Apparently, it’s no fool, and after a few futile attempts, it stops its bar-pressing behavior. This is called extinction of the operant behavior.
A behavior no longer followed by the reinforcing stimulus results in a decreased probability of that behavior occurring in the future.
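The two principles above -- reinforcement raises the probability of a behavior, extinction lowers it -- can be sketched as a toy simulation. This is only an illustrative model, not anything Skinner proposed: the rat's tendency to press the bar is a single probability, and each reinforced or unreinforced press nudges that probability up or down by an assumed learning rate.

```python
import random

def simulate(trials, reinforce, p_press=0.05, lr=0.2):
    """Toy model of operant conditioning.

    p_press  -- current probability the rat presses the bar on a trial
    reinforce -- True: pressing delivers a pellet (acquisition);
                 False: pressing delivers nothing (extinction)
    lr       -- assumed learning rate (how strongly one consequence
                shifts the tendency)
    """
    for _ in range(trials):
        if random.random() < p_press:  # the rat happens to press the bar
            target = 1.0 if reinforce else 0.0
            # The consequence modifies the tendency to repeat the behavior:
            p_press += lr * (target - p_press)
    return p_press

# Acquisition: pressing is reinforced, so the pressing probability climbs.
p = simulate(200, reinforce=True)
# Extinction: the pellets stop, and the behavior gradually fades.
p = simulate(200, reinforce=False, p_press=p)
```

Note that extinction in this sketch, as in the real phenomenon, only proceeds when the behavior is actually emitted and goes unreinforced -- a rat that never presses never learns the pellets have stopped.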