The Premack principle arises in the context of operant conditioning. It posits a psychological dimension that determines whether a behavior is repeated or disappears. That dimension is the value an individual attributes to a particular event, generated through their interaction with it.

This principle was one of the major contributions to operant conditioning theory in the mid-twentieth century. It broke with the traditional definition of “reinforcement,” which had important implications for learning models and for motivation studies.


Premack Principle: Definition and Origin

Between 1954 and 1959, the American psychologist David Premack and his wife and collaborator Ann James Premack conducted various studies on operant conditioning, analyzing the behavior of monkeys belonging to the genus Cebus.

The research was initially carried out at the Yerkes Laboratories of Primate Biology in Florida, and later continued at the University of Missouri, Columbia; at the University of California; and finally at the University of Pennsylvania.

Premack’s hypothesis was as follows: any response A will reinforce any response B if, and only if, the probability of response A occurring is greater than the probability of response B. In other words, they wanted to prove that an infrequent behavioral response could be reinforced by another response, as long as the latter was preferred over the former.

In other words, the Premack principle states that if a behavior or activity holds little interest, it is unlikely to occur spontaneously. However, if immediately after performing it there is an opportunity to carry out another behavior or activity that is of interest, then the first one (the one of little interest) becomes significantly more likely to be repeated.
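The rule stated above can be sketched in a few lines of code. This is a minimal illustration, not a behavioral model: the function name and the baseline probabilities are hypothetical, chosen only to mirror the candy/pinball example discussed later.

```python
# A minimal sketch of the Premack principle: a behavior reinforces another
# if and only if its baseline probability is higher.

def premack_reinforces(p_contingent: float, p_instrumental: float) -> bool:
    """Return True when the contingent (preferred) response is more probable
    at baseline than the instrumental response, so it can reinforce it."""
    return p_contingent > p_instrumental

# Hypothetical baseline probabilities: fraction of free time a child
# spends on each activity when both are freely available.
baseline = {"eating_candy": 0.7, "playing_pinball": 0.3}

# Making candy contingent on pinball should increase pinball playing:
print(premack_reinforces(baseline["eating_candy"], baseline["playing_pinball"]))
# The reverse arrangement should not work:
print(premack_reinforces(baseline["playing_pinball"], baseline["eating_candy"]))
```

Under these assumed baselines, the first call returns `True` and the second `False`, matching the principle’s asymmetry: only the more probable behavior can act as a reinforcer.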

The Premack Principle’s Contribution to Operant Conditioning

In Skinner’s operant conditioning, reinforcers are stimuli that tend to increase the frequency of a behavior. Thus, the very definition of “reinforcer” was given by its effect on behavior, by which any stimulus could strengthen any behavior as long as it was applied contingently. This placed the reinforcer itself at the center of efforts to increase any behavior.

But, in testing Premack’s hypothesis, Skinner’s theory of operant conditioning took an important turn: rather than functioning in an absolute way, reinforcers function relatively.

The reinforcer does not matter in itself; what matters is how many response opportunities it offers the individual. In this sense, what determines the effect of an event is the value that the subject attributes to the event itself. For this theory, responses are central: what increases the appearance of a behavior is not so much a “reinforcer” as a series of “reinforcing events.”

The Response Deprivation Theory

Subsequently, other experiments and studies conducted within operant conditioning have called the Premack principle into question.

Among them is the response deprivation theory. Broadly speaking, it suggests that there are situations in which restricting access to a reinforcing response, rather than simply reflecting a preference for it over the instrumental response, is itself what increases the motivation for the restricted response, and therefore for the behaviors associated with it. In short, this theory assumes that the less access one has to a behavior, the more motivation that behavior generates.
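The deprivation idea can be expressed as a simple ratio test: a schedule motivates the instrumental response when it restricts the contingent response below its free-baseline proportion. The sketch below follows the response-deprivation formulation associated with Timberlake and Allison; all durations are hypothetical and the function name is invented for illustration.

```python
# A hedged sketch of the response-deprivation condition: a schedule deprives
# the contingent response when the required instrumental/contingent ratio
# exceeds the ratio observed at free baseline.

def deprives_response(required_instrumental: float,
                      allowed_contingent: float,
                      baseline_instrumental: float,
                      baseline_contingent: float) -> bool:
    """True when the schedule restricts the contingent response relative to
    baseline, which on this theory is what makes the schedule motivating."""
    schedule_ratio = required_instrumental / allowed_contingent
    baseline_ratio = baseline_instrumental / baseline_contingent
    return schedule_ratio > baseline_ratio

# Hypothetical free baseline: 10 min of the instrumental activity,
# 30 min of the contingent activity.
# Schedule: 10 min of instrumental activity earns only 5 min of the
# contingent one -> access is restricted below baseline.
print(deprives_response(10, 5, 10, 30))
```

On these assumed numbers the schedule ratio (2.0) exceeds the baseline ratio (about 0.33), so the condition holds. Note that this test depends only on the ratios, not on which response was more probable at baseline, which is where it departs from the Premack principle.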

Value according to this theory

According to Pereira, Caicedo, Gutierrez, and Sandoval (1994), the Premack principle attributes great importance to the motivation generated by reinforcing events. One of the central concepts in Premack’s principle is “value,” whose definition can be summarized as follows:

Organisms organize world events according to a hierarchy of values.

Value is measured by the probability that the organism responds to a stimulus. In turn, that probability can be measured by the duration of the interaction with the response in question. That is, the more time spent performing an activity, the greater its value to the individual.

If a more valuable event is presented immediately after a less valuable one, the behavior associated with the latter is reinforced. Likewise, the less valuable event and the behavior involved in it acquire “instrumental” value.

If the opposite occurs (the lower-value event follows immediately after the higher-value one), what happens is punishment of the instrumental behavior. This means that the probability of repeating the less valuable behavior decreases.
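The two rules above can be condensed into a single comparison on the value hierarchy. The following toy sketch only restates them in code; the event names and value numbers are hypothetical.

```python
# A toy illustration of the value rules: a higher-value event following a
# lower-value behavior reinforces it; the reverse arrangement punishes it.

def contingency_effect(value_instrumental: float, value_contingent: float) -> str:
    """Classify the effect of making the contingent event follow the
    instrumental behavior, based only on their relative values."""
    if value_contingent > value_instrumental:
        return "reinforcement"  # the instrumental behavior becomes more likely
    if value_contingent < value_instrumental:
        return "punishment"     # the instrumental behavior becomes less likely
    return "neutral"            # equal values: no predicted change

# Hypothetical values on an individual's hierarchy.
values = {"homework": 2.0, "video_games": 8.0}

print(contingency_effect(values["homework"], values["video_games"]))
print(contingency_effect(values["video_games"], values["homework"]))
```

With these assumed values, games after homework yields "reinforcement" and homework after games yields "punishment", mirroring the two cases described in the text.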

Similarly, “value” is defined as a psychological dimension that individuals assign to events, just as they assign other properties (e.g., size, color, weight). In the same sense, value is assigned according to the specific interaction that the individual establishes with the event.

This psychological dimension determines the likelihood of a behavior appearing or disappearing, i.e., the effect of reinforcement or punishment. Thus, to ensure that a behavior occurs or is extinguished, it is essential to analyze the value that the individual attributes to it.

The above implies analyzing the individual’s current and previous interactions with the event whose associated behavior is to be reinforced, as well as their opportunities to generate other responses or events.

Experiment with Pinball and Candy

To sum it all up, we will describe an experiment that David Premack and his collaborators conducted with children. In the first part, the children were presented with two alternatives (called “responses”): eating candy or playing a pinball machine.

In this way, it was possible to determine which of the two behaviors was more likely to be repeated by each child (thereby establishing their level of preference).

In the second part of the experiment, the children were told that they could eat candy only if they first played the pinball machine. Thus, “eating candy” was the reinforcing response and “playing the pinball machine” was the instrumental response. The result of the experiment was that “eating candy” reinforced “playing the pinball machine” only for those children who preferred eating candy, i.e., those for whom it was the more probable behavior.