
Psychoanalysis – 1. Theories

 

Psychoanalysis originated as a series of psychological and psychotherapeutic theories and techniques developed by Sigmund Freud (1856 – 1939) and grew into a school of psychology with the contributions of Freud’s successors, including Carl Jung (1875 – 1961), Anna Freud (1895 – 1982), and Erik Erikson (1902 – 1994).

 

1. FUNDAMENTAL THEORIES

1.1. The Apparatus of the Mind

The theory of mental apparatus presents the human psyche as a working system of three components.


Image 1. A visualization of the mental apparatus based on Freud’s description (1938).


 

  • Id: the oldest component, holding all the biological instincts that the person inherits at birth.
  • Ego: stemming from the Id and developing through the person’s psychological growth.
  • Super-ego: sprouting from the ego and shaped by social standards, especially parental influences in the person’s early life.

Out of the three parts, the ego functions with the present while the others are strongly linked with the past – the Id with nature, and the super-ego with nurture. While the roles of the Id and super-ego seem as simple as storing biological and social inheritance, the ego is described as having a more proactive function. Firstly, it reconciles the contrast between the motivations from the Id (desires) and from the super-ego (ethics) in dealing with the person’s urges and wishes. Secondly, it acts as an agent between the person’s inner world and the outer world: saving external experiences into memory and retrieving them as the situation requires, making decisions by considering both the person’s needs and the situation, and taking action to protect the self from harm (by flight) as well as to optimize pleasure and advantage (by adaptation and activity).

     

    1.2. The Theory of Instincts

    With what forces and energies does the mental apparatus operate?


    Image 2. The two instincts depicted as two wheels which generate respective energies to create and loosen bonds.

So pondered Freud, and he answered: there are two fundamental forces underlying our mind, the love instinct and the destructive/death instinct. The former, also named Eros, goes with an energy of love called libido, uniting things to preserve the self and the species. The latter goes with an energy of destruction, breaking things apart and causing chaos.

It can be seen that the two instincts represent the contrasting forces which co-exist in all aspects of the universe: peace and aggression, union and separation, attraction and repulsion. Despite being opposites, they complement each other in the movements of our lives. For example, the act of eating means not only the destruction of what we eat (separation), but also the incorporation of it into our body (union). The ultimate aims of the two instincts, as understood from Freud’s view, are two sides of the life cycle: one continues the survival of the species into the future, while the other returns living matter to its earlier, inanimate state.

     

    1.3. Sexual Development

Freud’s theory of sexual development challenged many traditional views about sexuality in children. First of all, he drew a distinction between the concepts “sexual” and “genital”. While traditional thinking equates “sexual” with genital desire, which is only aroused at puberty, Freud viewed sexual desires simply as urges that bring us bodily pleasure, and believed that they exist in us from infancy in various forms rather than just the genital one. These sexual desires develop over five stages, as follows.


Image 3. Key activities across the five stages: sucking (oral), defecation (anal), masturbation (phallic), non-sexual activities (latency), and sexual engagement (genital).

• Oral stage. Beginning at birth, the sexual need of this period is linked to the act of breastfeeding, which originally serves pure survival but gradually becomes a desire for possession. When the child is no longer breastfed, they develop the habit of sucking things, especially their thumb as the most convenient object, in an attempt to revive their first impression of pleasure.
• Anal stage. In this period, the child is given bowel training. They find pleasure in releasing the contents of their bowel, which can be seen as their first possessions to offer to the external world. Sometimes they choose to withhold defecation for a while in order to enjoy the pleasure of the act later to its fullest.
• Phallic stage. The child develops an interest in their own sex and their genital organ. According to Freud, boys at this stage enjoy playing with their penis and experience the Oedipus complex, in which they feel an attraction to their mother and rivalry toward their father, unconsciously finding the former an object of sexual desire and the latter a rival of masculinity. Carl Jung, one of Freud’s students, later proposed the term “Electra complex” for the comparable phenomenon in girls at the phallic stage (feeling attracted to their father and envious of their mother).
    • Latency stage. Sexual urges become inactive; the child’s interest is shifted toward other activities outside their body.
• Genital stage. The final sexual urge awakens at puberty and leads to activities of mating and reproduction.

     

    1.4. Mental Qualities

The theory of mental qualities assumes that our mental activities move between three different states: conscious – our ongoing thought; preconscious – past thought and memory which we can easily retrieve to consciousness; and unconscious – buried thought and memory which only reveal themselves in our dreams or through a special therapeutic process. While conscious activities seem most relevant to our life, Freud argued that they remain conscious only for a brief time before becoming preconscious and/or unconscious, and hence the conscious occupies a very small portion of our mental world. Specifically, he posited that only peripheral processes in the ego are conscious, while most activities of the ego and all of the super-ego are preconscious or unconscious, and the entire Id is governed by the unconscious.


Image 4. The mental processes symbolized as an iceberg: only a small peak is visible (conscious), while the rest is hidden under water (preconscious and unconscious).

Also according to Freud, exploring the unconscious helps uncover unresolved emotional issues which are responsible for mental disorders and crises. He applied this idea in his therapeutic practice, in which the psychiatrist traced his patients’ current problems back to repressed feelings of the past. And in exploring the unconscious, one of the sources Freud turned to was dreams.

     

    1.5. The Theories of Dreams

Freud was one of the first to examine human dreams as a form of psychological manifestation and, furthermore, to build a system of theories about them based on his actual practice of dream analysis. These dream theories and dream-analysis experiences are elaborated in his two books The Interpretation of Dreams (1900) and On Dreams (1914). Generally, his view is that dreams are unconscious expressions of desires, but since most of these desires are repressed by the person in real life, they are disguised under irrelevant or chaotic dream stories.

A dream, according to Freud, consists of three layers. The core, the latent content, holds the dreamer’s desire, fed from their Id (unconscious desires) or ego (conscious or preconscious desires). The middle layer is a complex mechanism called dreamwork, which creates a disguise for the latent content by condensation – concentrating the intricacy of the desire into simplistic events; displacement – replacing the desire with a different event; and representation – producing events or images that symbolize the desire in an abstract manner. The dreamwork generates the outer layer, the manifest content, which, typically in adults’ dreams, often misleads us away from the true, latent content. Freud believed that in order to understand one’s dream, special dream-analysis techniques are required to interpret the condensed stories and symbols into the person’s concealed desires.

Not all desires, however, are disguised in dreams. Freud noticed that small children’s dreams often manifest their true, childlike wishes, likely because these dreamers tend not to conceal or repress their desires. Such dreams, in which manifest content and latent content are similar, make up Class 1 – dreams of meaningful stories and undisguised desires, reflecting the person’s life of the day. Class 2 includes dreams of meaningful stories as well, but these stories appear weird and inconsistent with the person’s thoughts and feelings, and Class 3 includes dreams of incoherent content which are incomprehensible without a systematic dream analysis. The dreams of both Class 2 and Class 3 have their latent content disguised under a manifest content by dreamwork.


    Image 5. A diagrammatic summary of Freud’s conceptualization of dreams: 1) The structure of a dream (left), and 2) Three classes of dreams.


     

If dreams are indeed so complicated, how did Freud decode his patients’ dreams and even thereby resolve emotional issues? The next part, Psychoanalysis – Applications, will look into Freud’s techniques of dream analysis, as well as other therapeutic methods that the psychiatrist employed in his day.

     
     


    References:

    Freud, S. (1913). The Interpretation of Dreams (3rd Ed., A. A. Brill, Trans.). New York, NY: The Macmillan Company. Retrieved from http://www.psywww.com/books/interp/toc.htm

    Freud, S. (1914). On Dreams (M. D. Eder, Trans.). New York, NY: Rebman. Retrieved from https://archive.org/details/ondreams1914freu

Freud, S. (1940). An outline of psycho-analysis. International Journal of Psycho-Analysis, 21, 27-84. Retrieved from http://icpla.edu/wp-content/uploads/2012/10/Freud-S.-An-Outline-of-Psychoanalysis-Int.-JPA.pdf

    Freud, S. (2000). Three Essays on the Theory of Sexuality (J. Strachey, Trans.). New York, NY: Basic Books.

     
     

    Parenting Styles (Baumrind, 1966)

     

With respect to parenting, Diana Baumrind’s works on parenting styles are among the earliest milestones in developmental psychology. They started with Baumrind’s detailed explorations of the parent-child relationships of various families in Berkeley, the US (Baumrind, 1967; Baumrind, 1971), and with her proposition of three parenting patterns – Authoritarian, Authoritative, and Permissive.

Table 1 compares and contrasts the three parenting styles according to Baumrind (1966, 1971). Overall, authoritarian parenting and permissive parenting are two opposites, with the former exerting absolute control over children and the latter yielding to children’s individuality, while authoritative parenting strikes a balance between freedom and restrictions. Although both place high demands on children, authoritarian parents tend to base their standards on traditional beliefs, whereas authoritative parents rationalize their rules to serve the purposes of discipline as well as their children’s autonomous compliance.


    Table 1. A comparison between Baumrind’s three parenting styles.

     

In her later works, Baumrind mentioned two other parenting styles – Nonconforming and Harmonious. Nonconforming parenting was described as promoting individuality and independence in children while at the same time expecting them to strive for excellence (Baumrind, 1971b). It can be understood that these parents exert low control – as opposed to authoritarian and authoritative ones – but hold high demands – as opposed to permissive ones.

Harmonious parenting was characterized by its focus on resolving discord within the home to preserve a peaceful environment which benefits all family members equally. Although there is a certain equality in the interactions between harmonious parents and their children, these parents maintain their adult roles, whereas permissive and nonconforming ones may neglect these and behave childishly. Finally, unlike nonconforming parents, harmonious parents remain in agreement with the mainstream. (Baumrind, 1971a; Baumrind, 1971b)

Perhaps because of Baumrind’s limited samples and a lack of supporting evidence from subsequent studies for nonconforming and harmonious parenting, current documentation refers to Baumrind’s theory of parenting styles as including only three patterns – authoritarian, authoritative, and permissive. The effects of these three patterns of upbringing on children’s development have been examined by a substantial body of research, including several studies by Baumrind herself. This topic will be explored in an upcoming entry at PsychPics’ Findings.

     


     

    References:

    Baumrind, D. (1966). Effects of authoritative parental control on child behavior. Child Development, 37 (4), 887-907.

    Baumrind, D. (1967). Child care practices anteceding three patterns of preschool behavior. Genetic Psychology Monographs, 75 (1), 43-88.

Baumrind, D. (1971a). Harmonious parents and their preschool children. Developmental Psychology, 4(1), 99-102.

Baumrind, D. (1971b). Current patterns of parental authority. Developmental Psychology Monograph, 4(1), 1-103.

     
     

    Behaviorism – 2. Principles

     

The philosophy of behaviorism was first articulated by John Watson – who would later co-author the Little Albert experiment in 1920 (see Behaviorism – 1. Historical Progress) – in his essay “Psychology as the behaviorist views it” (1913). In this writing, Watson argued that the science of psychology needed to be studied in a systematic, objective manner, taking into account only observable behaviors. This perspective was later elaborated on a variety of related matters – genetics, instinct, feelings, introspection, consciousness, personality, and so on – in his book Behaviorism (1925).

    Watson’s view of behaviorism emphasizes that:

• Behaviorism is a pure natural science, most closely related to physiology.
• Behaviorism uses only objective research methods, namely experiments supplemented by systematic observation.
    • Only overt behaviors should be considered.
    • Human behaviors should be examined on the same level with animal behaviors.
    • Conditioning is a key to exploring and explaining behaviors.
• Heredity is of little significance to human development, except in cases of disability. Given appropriate conditions, any healthy newborn can be shaped in a chosen direction regardless of disposition.

Despite certain objections from his contemporaries (including the psychoanalyst Sigmund Freud), Watson’s approach was adopted by many other psychologists. Some of these successors did not totally agree with Watson’s view and developed their own directions of behaviorism, such as Edward Tolman with purposive behaviorism and Clark Hull with drive theory. The best known of all was perhaps B. F. Skinner, who not only expanded the theoretical and practical scope of behaviorism with his introduction of operant conditioning (see Behaviorism – 1. Historical Progress), but also revised the philosophy of behaviorism by providing supplementary arguments for Watson’s standpoints in his own work About Behaviorism (1974).

Table 1 outlines and compares the opinions of the two behaviorists on the key issues. Skinner strongly supported Watson on the matter of introspection, or mental explanation. His reasoning went further, clarifying that because our self-knowledge, consciousness, and self-control are themselves constrained by the stimulus conditions surrounding us, introspection is a very unreliable method for explaining our behaviors. In amending Watson’s statements, Skinner redefined behaviorism as a philosophy of the science of human behavior rather than the science itself, and shed light on the complex factors behind human behavior rather than simply equating it with animal behavior.


    Table 1. A comparative summary of Watson’s and Skinner’s views of behaviorism

     

With Watson’s and Skinner’s stances combined, the principles of behaviorism can be summarized as follows:

• Observation focuses on overt behaviors, with strictly no introspection.
• Research methods are objective, systematic, and experimental.
    • Physiological explanations are important.
    • The control of environmental stimuli is highlighted over heredity and free will.
    • Conditioning is the key concept and method.

     


    References: 

Watson, J.B. (1913). Psychology as the behaviorist views it. Psychological Review, 20, 158–177.

Watson, J.B. (1925). Behaviorism. New York, NY: Kegan Paul, Trench, Trubner & Co.

Skinner, B.F. (1974). About Behaviorism. New York, NY: Alfred A. Knopf.

     

    Behaviorism – 3. Methods and Applications

     

    Current applications of behaviorism are based on the three discoveries mentioned in Part 1: classical conditioning, operant conditioning, and social learning.

    1. CLASSICAL CONDITIONING

    1.1. Basic Concepts

• Classical conditioning is defined as a learning process in which an involuntary response (e.g., fear) is formed by the association of two stimuli. Of the two stimuli, one is able to elicit the target response before conditioning (hence called the unconditioned stimulus – US), while the other is not (hence called the conditioned stimulus – CS). The target response is referred to as the unconditioned response (UR) when it follows the US, and as the conditioned response (CR) when it follows the CS after conditioning. The process is formulated simply as follows:

(1)     US            ——>  UR

(2)     CS            ——>  no response

(3)     US + CS   ——>  UR      (× times)

(4)     CS            ——>  CR

• Extinction is the reverse process of conditioning: when the CS is repeatedly and consistently presented without the accompanying US, the formed association between the two stimuli weakens, until eventually the presence of the CS no longer elicits the CR. A minimal toy simulation of acquisition and extinction is sketched below.
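To make the formulation above concrete, here is a minimal toy simulation, offered purely as an illustration: it is not drawn from Pavlov’s or Watson’s work, and the “associative strength” variable, learning rates, and threshold are arbitrary assumptions. Repeated CS + US pairings push the strength up until the CS alone elicits the CR; repeated CS-alone trials then weaken it again (extinction).

    # Toy illustration of acquisition and extinction in classical conditioning.
    # The associative-strength variable and its rates are illustrative assumptions,
    # not values taken from the studies cited in this entry.

    def run_conditioning(pairings: int, cs_alone_trials: int,
                         acquisition_rate: float = 0.3,
                         extinction_rate: float = 0.2,
                         threshold: float = 0.5) -> None:
        strength = 0.0  # CS-response association; 0 = neutral stimulus

        for trial in range(1, pairings + 1):          # step (3): CS paired with US
            strength += acquisition_rate * (1.0 - strength)
            print(f"pairing {trial}: strength={strength:.2f}, "
                  f"CR elicited by CS alone: {strength >= threshold}")

        for trial in range(1, cs_alone_trials + 1):   # extinction: CS without US
            strength -= extinction_rate * strength
            print(f"CS-alone trial {trial}: strength={strength:.2f}, "
                  f"CR elicited by CS alone: {strength >= threshold}")

    if __name__ == "__main__":
        run_conditioning(pairings=6, cs_alone_trials=8)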

    Figure 1. Temporal paradigms in classical conditioning

    1.2. Temporal Paradigms

Temporal paradigms in classical conditioning concern the order and timing of stimulus presentation. There are five paradigms: simultaneous (1-s delay), simultaneous (exact), delayed, trace, and backward, as illustrated in Figure 1; an illustrative classifier in code follows the list below.

• Simultaneous (1-s delay): The US is presented around one second after the CS, while the CS still remains. This paradigm is considered the most efficient at producing an excitatory effect (the CR is elicited at the presentation of the CS).
• Simultaneous (exact): The US and CS are presented at exactly the same time. According to Pavlov (1927), the US would gradually be taken as a signal for the end of the CS, and hence cause an inhibitory effect (the response is suppressed at the presentation of the CS). Some other studies, however, reported that this paradigm caused a weak excitatory effect.
• Delayed: The US is presented seconds or even minutes after the CS, while the CS is still present. In this case, the CR occurred only after the CS had been presented for a while, and continued with increasing intensity until the US appeared.
• Trace: The CS appears and then disappears, and a while later the US is presented. As in delayed conditioning, the CR occurred only after the CS had disappeared for a while, and continued with increasing intensity until the US appeared. In this case, the CR was associated with a memory trace of the CS.
• Backward: The US is presented before the CS. This paradigm was generally found to cause inhibition.
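The five paradigms can also be expressed as a small decision rule over the onset and offset times of the two stimuli. The function below is an illustrative sketch based solely on the list above; the one-second cut-off simply mirrors the “1-s delay” label and is not a value taken from Pavlov (1927).

    # Illustrative classifier for the temporal paradigms described above.
    # Times are in seconds, measured from the start of a trial.

    def classify_paradigm(cs_onset: float, cs_offset: float, us_onset: float) -> str:
        if us_onset < cs_onset:
            return "backward"                    # US precedes the CS
        if us_onset == cs_onset:
            return "simultaneous (exact)"        # CS and US start together
        if us_onset > cs_offset:
            return "trace"                       # CS has already ended when US appears
        if us_onset - cs_onset <= 1.0:
            return "simultaneous (1-s delay)"    # US follows within ~1 s, CS still present
        return "delayed"                         # US follows after a longer wait, CS still present

    if __name__ == "__main__":
        print(classify_paradigm(0.0, 5.0, 0.5))    # simultaneous (1-s delay)
        print(classify_paradigm(0.0, 5.0, 30.0))   # trace
        print(classify_paradigm(0.0, 60.0, 30.0))  # delayed
        print(classify_paradigm(2.0, 5.0, 0.0))    # backward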

    1.3. Applications

• Aversion therapy is an application of classical conditioning for eliminating unwanted behaviors by associating the target behavior with an unpleasant experience. The approach is typically employed in treating addictions (alcoholism, smoking, drug dependence, gambling), bad habits (overeating, self-harm behaviors), and violent behaviors. Its methods include chemical aversion therapy (administering nausea-inducing drugs to patients together with their addictive substances), covert sensitization (showing patients aversive images along with pictures of the unwanted behaviors), and even electric shock.
    • Counterconditioning is the application of inhibition and extinction in eliminating a negative feeling associated with a stimulus. It is typically used in treating phobias and anxiety disorders. There are two techniques of counterconditioning.
  • Systematic desensitization. Developed by Wolpe (1958, 1961), the technique comprises three steps. First, the patient and the psychologist work together to define all the stimuli which provoke the anxiety. Second, the patient is taught relaxation methods to practice, including deep breathing and muscle relaxation. Third, the patient is exposed to each of the anxiety-provoking stimuli, from the least to the most severe, while exercising relaxation. Relaxation helps inhibit the anxiety, which in turn extinguishes the sense of fear.
  • Flooding. An application of extinction, flooding is executed simply by introducing the anxiety-provoking stimuli to the subject in step-by-step trials, from the least to the most severe, with no actual unpleasant experience occurring. The exposure can be either realistic (Kimble and Randall, 1953; Polin, 1959; as cited in Rachman, 1965) or through images (Rachman, 1965).

    2.  OPERANT CONDITIONING

    Operant conditioning, or instrumental conditioning, is defined as a learning process where a voluntary behavior is shaped by the association of that behavior with a consequence.

    2.1. Basic Concepts

    • Types of consequences. 
  • Positive reinforcement: the act of rewarding a behavior by presenting a pleasant stimulus, to encourage that behavior to be repeated. The typical reinforcing stimulus in animal studies such as Skinner’s (1938) is food; in everyday situations it is compliments or material gifts.
  • Negative reinforcement: the act of rewarding a behavior by removing a discomfort that the individual has had to endure, to encourage that behavior to be repeated. In Skinner’s studies (1938), the rats received an electric shock after a light signal appeared in the cage; as they jumped around and accidentally pressed the lever, the current was turned off. This reinforced the rats’ response of pressing the lever to avoid the shock every time they saw the light signal. The removal of the electric current is an example of negative reinforcement.
  • Punishment: the act of presenting an unpleasant stimulus after a behavior is performed, to discourage the behavior from being repeated. Punishment is common in reality – we all experienced it in childhood when we were caught misbehaving. Skinner (1950) considered it less effective than the other types of consequence and advised using it only intermittently.
• Shaping. In operant conditioning, shaping refers specifically to the process of helping an individual form a desired behavior through step-by-step guidance and reinforcement. A striking example is Skinner’s experiment with pigeons (1938), in which he wanted to train the pigeons to peck a certain spot in the cage. Obviously the researcher would not wait for the pigeon to initiate the behavior on its own, which might take forever to occur, but had to create a calculated procedure to guide the pigeon toward the exact behavior. In the first step, he administered food to the pigeon whenever it turned slightly toward the chosen spot, to encourage the bird to turn in that direction again and again. After a while, the reinforcement was withheld until the pigeon made a slight movement toward the spot. In this second step, food was administered only when the pigeon moved its head closer to the spot. As it moved closer and closer, its beak would eventually touch the spot, and from then on, reinforcement was given only when the bird struck its beak at the spot. A minimal sketch of this successive-approximation loop appears after this list.
• Extinction: the process in which the association between the learned behavior and its consequence weakens, until eventually the subject stops producing the behavior. Extinction typically occurs after the behavior has long failed to meet the expected consequence.
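The successive-approximation idea behind shaping can be sketched as a loop, purely for illustration (this is not Skinner’s actual apparatus or data): any response within the current tolerance of the target is reinforced, the criterion is then tightened, and reinforcement concentrates behavior ever closer to the target.

    import random

    # Illustrative sketch of shaping by successive approximation.
    # "distance" stands in for how far the current response is from the target
    # behavior (e.g., the pigeon's beak from the chosen spot); all numbers are
    # arbitrary assumptions for demonstration.

    def shape_behavior(target_tolerance: float = 0.05, seed: int = 1) -> None:
        rng = random.Random(seed)
        tolerance = 1.0   # criterion: how close a response must be to earn food
        spread = 1.0      # how widely responses scatter; narrows with reinforcement

        step = 0
        while tolerance > target_tolerance:
            step += 1
            distance = rng.uniform(0.0, spread)
            if distance <= tolerance:
                spread = max(spread * 0.9, target_tolerance)  # behavior concentrates near target
                tolerance *= 0.8                              # tighten the criterion
                print(f"step {step}: distance={distance:.2f} -> reinforced (tolerance now {tolerance:.2f})")
            else:
                print(f"step {step}: distance={distance:.2f} -> reinforcement withheld")
        print("responses now reliably fall on the target")

    if __name__ == "__main__":
        shape_behavior()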

    2.2. Schedules of Reinforcement

Schedules of reinforcement refer to the planning of reinforcing stimuli in a specific order and timing for operant conditioning. The following are some of the various schedules discussed in Skinner (1950) and Ferster and Skinner (1957).

      • Continuous reinforcement. Reinforcement is administered every time the subject makes a desired response.
  • Fixed ratio. Reinforcement is administered every time the subject completes a fixed number of responses, counted from the last reinforcement. The timing and frequency of the reinforcing stimuli vary in accordance with the timing and frequency of the subject’s responses.
      • Variable ratio. Reinforcement also depends on the number of responses completed; however, this number varies from time to time and is randomized from a series of values.
      • Fixed interval. Reinforcement is administered to the first desired response made after a certain interval of time.
      • Variable interval. Reinforcement also depends on the first response made after an interval of time; however, the length of this interval varies from time to time and is randomized from a series of values.
      • Alternative. Reinforcement is scheduled both in accordance to ratio (the required number of responses) and interval of time, depending on which is fulfilled first.
      • Conjunctive. Reinforcement is administered only after both a required ratio and an interval of time are met.
  • Interlocking. Reinforcement depends on the number of responses completed, and this number varies from time to time. However, it is not randomized but tied to the response rate. For instance, if the subject responds very quickly, they will have to complete a large number of responses to be reinforced; but if they make no response for a long interval, their first response after that will be reinforced right away.
  • Tandem. Reinforcement is administered only after both a required ratio and an interval of time are met, but in a fixed order. Given a required ratio of 10 responses and an interval of 10 minutes, for instance, conjunctive reinforcement is administered every time both requirements are met, no matter which occurred first, while tandem reinforcement is given only when the ratio is completed first and then the interval passes.
  • Chained. Reinforcement depends on both ratio and interval of time, but the reinforcing stimulus changes after each of the required components.
      • Adjusting. The value of the interval or ratio is modified systematically after the reinforcement, depending on the most recent performance of the subject.
  • Multiple. The schedule consists of numerous scheduling types which occur in random order, each associated with a certain, different reinforcing stimulus.
  • Mixed. The schedule consists of numerous scheduling types which occur in random order, similarly to multiple reinforcement, but the reinforcing stimulus is randomized.
  • Interpolated. A short schedule is inserted into another, longer schedule. For example, a fixed-ratio set of 2 reinforcements for 20 responses each is placed within a four-hour fixed interval.

While it is the simplest paradigm, the continuous schedule is found to be less effective than the others, as it leads to slow response rates and quick extinction. The pros and cons of the reinforcement schedules mentioned here, according to research findings, will be elaborated in a coming entry of PsychPics; a brief illustrative sketch of two of the schedules in code follows.
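As an illustration of how two of these schedules differ when written out as rules (a hypothetical sketch, not Ferster and Skinner’s apparatus), a fixed-ratio schedule counts responses since the last reinforcement, while a variable-interval schedule reinforces the first response made after a randomly drawn waiting time.

    import random

    # Illustrative sketches of two reinforcement schedules listed above.

    class FixedRatio:
        """Reinforce every n-th response since the last reinforcement."""
        def __init__(self, ratio: int):
            self.ratio = ratio
            self.responses_since_reinforcement = 0

        def on_response(self) -> bool:
            self.responses_since_reinforcement += 1
            if self.responses_since_reinforcement >= self.ratio:
                self.responses_since_reinforcement = 0
                return True      # deliver the reinforcing stimulus
            return False

    class VariableInterval:
        """Reinforce the first response made after a randomly drawn interval."""
        def __init__(self, mean_interval_s: float, seed: int = 0):
            self.rng = random.Random(seed)
            self.mean = mean_interval_s
            self.next_available = self.rng.expovariate(1.0 / self.mean)

        def on_response(self, time_s: float) -> bool:
            if time_s >= self.next_available:
                self.next_available = time_s + self.rng.expovariate(1.0 / self.mean)
                return True
            return False

    if __name__ == "__main__":
        fr = FixedRatio(ratio=5)
        print([fr.on_response() for _ in range(12)])     # reinforced on the 5th and 10th response

        vi = VariableInterval(mean_interval_s=30.0)
        response_times = [5.0, 20.0, 35.0, 40.0, 90.0]   # seconds into the session
        print([vi.on_response(t) for t in response_times])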

    2.3. Applications

• Behavior modification. Behavior modification is the generic term for the application of operant conditioning concepts – reinforcement, punishment, shaping, and extinction – to altering behaviors or forming new habits. Behavior modification can be designed in multitudinous ways, based on the variety of schedules and techniques one may use as well as combine. It can be applied in numerous settings, from formal educational and clinical programs to casual personal training and domestic management.

    Behavior modification in mental health settings is referred to as contingency management. Due to the complexity and severity of its target behaviors, the practice of contingency management is more stringent and systematic compared to behavior modification in other contexts.

A similarity between behavior modification and classical-conditioning-based therapies is that they work on forming outward responses without intervening in the person’s thoughts – which is in line with the principles of behaviorism. These interventions are grouped under Applied Behavior Analysis (ABA).

• Cognitive behavior therapy (CBT) is the common term for psychotherapeutic techniques which work on practical problem-solving by – unlike ABA – changing both the behaviors and the patterns of thinking underlying the problem. CBT combines the operant conditioning methods of behaviorism with the introspective methods of cognitive psychology, creating a connection between the two branches of psychology.

    3. SOCIAL LEARNING THEORY

Social learning theory establishes the third learning process – learning through observing and imitating others. As formulated by Bandura (1969, 1971), social learning is accomplished through four interlaced sub-processes:

• Attention. The learner attends to and recognizes the significant details of the model’s behavior. This process is greatly influenced by how much the learner relates themselves to the model, how clearly the functional value of the observed behavior is demonstrated, and how attractively the model presents themselves.
• Retention. The learner retains the observed behavior in their memory. The memory does not record the entire event, but condenses it into essential elements – significant imaginal and verbal patterns – to be preserved.
• Motoric reproduction. When the time comes to repeat what has been learned, the learner does not recall the exact observed behavior. Instead, patterns of the behavior are brought to mind, directing their actions.
    • Reinforcement and motivation.  This process strengthens the likelihood that a learned behavior is reproduced in overt performance.

The application of social learning theory in behavioral intervention is called social modeling. Traditionally a featured method in education, it is now increasingly applied in treating mental disorders and criminal behaviors as well. In recent decades, social modeling has become an integral part of computer science and engineering, providing fundamental principles for building software and information systems which require ergonomic knowledge and techniques.

    Figure 2 summarizes the methods and applications of behaviorism and their connections with other fields of psychology.


    Figure 2. Methods and applications of behaviorism

     


    References:

    Bandura, A. (1969). Social learning theory of identificatory processes. In D. A. Goslin (Ed.), Handbook of Socialization Theory and Research (pp.213-262). Chicago, IL: Rand McNally.

    Bandura, A. (1971). Social Learning Theory. New York, NY: General Learning Press.

    Ferster, C. B., & Skinner, B. F. (1957). Schedules of Reinforcement. Englewood Cliffs, NJ: Prentice-Hall, Inc.

Higgins, S. T., & Petry, N. M. (1999). Contingency management: Incentives for sobriety. Alcohol Research and Health, 23(2), 122-127. Retrieved from http://pubs.niaaa.nih.gov/publications on January 11, 2016.

    Kellogg, S. H., Stitzer, M. L., Petry, N. M., & Kreek, M. J. (2007). Contingency management: Foundation and principles.  Unpublished manuscript. Retrieved from http://nattc.org on January 2, 2016.

    Pavlov, I.P. (1927). Conditioned Reflexes: An Investigation of the Physiological Activity of the Cerebral Cortex (G. V. Anrep, Trans.). London, UK: Oxford University.

    Rachman, S. (1965). Studies in desensitization – II: Flooding. Behaviour Research and Therapy, 4(1). 1-6.

    Skinner, B. F. (1938). The Behavior of organisms: An experimental analysis. New York, NY: Appleton-Century.

Skinner, B. F. (1950). Are theories of learning necessary? Psychological Review, 57, 193-216. Retrieved from http://psychclassics.yorku.ca/Skinner/Theories/

    Skinner, B. F. (1953). Science and Human Behavior. New York, NY: Macmillan.

Skinner, B.F. (1974). About Behaviorism. New York, NY: Alfred A. Knopf.

    Watson, J. B. (1913). Psychology as the behaviorist views it. Psychological Review, 20, 158–177.

    Watson, J. B., & Rayner, R. (1920). Conditioned emotional reactions. Journal of Experimental Psychology, 3(1). 1–14.

Watson, J.B. (1925). Behaviorism. New York, NY: Kegan Paul, Trench, Trubner & Co.

    Wolpe, J. (1958). Psychotherapy by Reciprocal Inhibition. Stanford, California: Stanford University Press.

Wolpe, J. (1961). The systematic desensitization treatment of neuroses. Journal of Nervous and Mental Disease, 132, 189-203.

    Yu, E. S. (2009). Social modeling and i*. Unpublished manuscript. Retrieved from http://www.cs.toronto.edu/ on January 5, 2016.

Ecological Systems Theory (Bronfenbrenner, 1994)

     

The Ecological Systems Theory (Bronfenbrenner, 1994) conceives of the ecological environment in which an individual lives as a set of five systems, each nested inside the next, with the Microsystem being the innermost and the Chronosystem the outermost (a simple structural sketch in code follows the list below).

    1. MICROSYSTEM – The Immediate Environment
Consists of the activities, social roles, and interpersonal relations that the individual directly experiences in a given setting, such as family, school, peer group, and workplace.

    2. MESOSYSTEM – The System of Microsystems
    Comprises the connections between two or more settings that surround the individual, such as the relations between home and school, between home and workplace…

    3. EXOSYSTEM
Comprises the connections between the settings that surround the individual (home, school, peer group…) and external settings which indirectly influence the individual’s life (parents’ workplaces, the neighborhood…).

    4. MACROSYSTEM
    Encompasses comprehensive patterns of the microsystems, mesosystems, and exosystems of the environment in which the individual lives, such as national regime, belief system, knowledge, infrastructure, lifestyles, hazards…

    5. CHRONOSYSTEM
    Includes the changes or consistency over time in the characteristics of the individual and of the environment in which they live, such as changes in family structure, socioeconomic status, employment, place of residence…
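The nesting of the five systems can be pictured as a simple layered structure. The sketch below is only an illustrative mapping of the descriptions above, not part of Bronfenbrenner’s own formulation; the example entries are taken from the list.

    # Illustrative layering of the five ecological systems, innermost first.
    ecological_systems = [
        ("Microsystem",  ["family", "school", "peer group", "workplace"]),
        ("Mesosystem",   ["home-school relations", "home-workplace relations"]),
        ("Exosystem",    ["parents' workplaces", "the neighborhood"]),
        ("Macrosystem",  ["national regime", "belief system", "infrastructure", "lifestyles"]),
        ("Chronosystem", ["changes in family structure", "socioeconomic status", "place of residence"]),
    ]

    for depth, (system, examples) in enumerate(ecological_systems):
        print("  " * depth + f"{system}: e.g., {', '.join(examples)}")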

     

In proposing the theory, Bronfenbrenner emphasized the importance of studying human development through the movements of its surrounding environments and their interactions, citing research evidence for the influence of each of the systems on individual lives. One example was Elder’s study Children of the Great Depression, which reported that young people who experienced the decline of their home economy during adolescence – a facet of the Chronosystem – seemed to develop a more robust sense of achievement and career orientation than those who experienced it as small children (Elder, 1974, as cited in Bronfenbrenner, 1994). Another was Epstein’s finding that closer collaboration between teachers and parents in elementary school – part of the Mesosystem – contributed to the higher levels of autonomy and achievement that students would obtain in high school (Epstein, 1983a, 1983b, as cited in Bronfenbrenner, 1994). And beyond the scientific realm, we can all recognize the impacts of the ecological systems in real life by relating them to our own experiences as well as our observations of others.

In consideration of heredity, Bronfenbrenner proposed an extension of the Ecological Systems Theory, termed the “bioecological model”. The model acknowledges the role of heritability, but adds that heritability itself is under the influence of the environment: hereditary potential can be realized to varying degrees, and the level its manifestations actually reach depends on the quality of the environmental conditions surrounding it. Despite its attempt to combine biology and environment in explaining human development, the model fails to provide in-depth insights into biological factors, relying largely on the conceptualization of ecological contexts. Because of this limitation, Bronfenbrenner’s Ecological Systems Theory is regarded as a complementary, rather than comprehensive, perspective in the area of developmental psychology (Shaffer & Kipp, 2014).

     


References:

Bronfenbrenner, U. (1994). Ecological models of human development. In T. Husen & T. N. Postlethwaite (Eds.), International Encyclopedia of Education (2nd ed., Vol. 3, pp. 3-44). Oxford, UK: Elsevier.

Shaffer, D. R., & Kipp, K. (2014). Developmental Psychology (8th ed.). Belmont, CA: Wadsworth Cengage Learning.

    Behaviorism – 1. Historical Progress

     

A number of discoveries and theoretical developments have contributed to the progress of behaviorism from its dawn to the present day. The following are the best recognized milestones.

     

    1. PAVLOV’S REFLEX STUDIES & CLASSICAL CONDITIONING (Pavlov, 1890s)

Pavlov’s experiments with dogs are very well known in the history of science. They formed a foundation for the concept of CONDITIONING, which is now a significant method in behavioral research and in applied areas which adopt behaviorism.

The discovery occurred in the course of Pavlov’s examinations of dogs’ reflex actions, such as drooling at food. Pavlov noticed that there were times when a dog drooled just as he entered the room, even without any food presented.


Image 1. A simplified visualization of Pavlov’s conditioning experiment using the sound of the tuning fork as the CS.
a) Food was fed, to familiarize the dog with it.
b) Food was presented, to ensure its sight evoked the UR (salivation).
c) Food was presented to the dog while the tuning fork was struck. The UR occurred. These combined acts were repeated several times.
d) The tuning fork was struck without the food presented. The CR occurred. Conditioning was achieved.
e) After several repetitions of (d), the dog no longer salivated at the sound. Extinction was achieved.

To investigate the phenomenon, Pavlov brought food to the dog while at the same time presenting a factor to which the dog did not particularly respond (e.g., a light, the sound of a tuning fork…). After several repetitions of this combination, at one point Pavlov presented only the additional factor, and the dog drooled at it as well. This process was defined by Pavlov as conditioning. The food was called an unconditioned stimulus (US), the salivation at the food an unconditioned response (UR), the additional factor a conditional/conditioned stimulus (CS), the dog’s original indifference to the CS a neutral response (NR), and the salivation at the mere presentation of the CS a conditioned response (CR).

The basic procedure of Pavlov’s conditioning can be formulated as follows:

(1)       US               ———->    UR
(2)       CS + US     ———->    UR         (× times)
(3)       CS               ———->    CR          (conditioning achieved)

It was noticed that, at a later point, following repetitions of Step (3), the dog stopped drooling at the sound, resuming its NR to the CS. This phenomenon is termed extinction.

Pavlov’s conditioning process is now called classical conditioning, as distinguished from the operant conditioning coined and demonstrated by Skinner (see Section 5 below).

     

    2. PUZZLE BOX (Thorndike, 1898)

    Puzzle Box was an instrument designed by Thorndike for studying learning in animals, namely cats and dogs. The box was built as a cage with a simple locking mechanism (see Image 2).


    Image 2. A simplified visualization of the Puzzle Box based on Thorndike’s model (1898).


Image 3. A simplified visualization of Thorndike’s puzzle box study with a cat.
a) A fish was used to tempt the cat into escape.
b) The cat pushed the bar, unlocking the gate on its side.
c) The cat pulled the string, unlocking the gate on its top.
d) The cat pushed down the gate and escaped the box.

    The test animal was to be locked in the box, while a tempting piece of food was presented to it beyond its reach. While trying to get out of the box for food, the animal would accidentally remove the locks. After every escape, the animal was awarded the food and then placed into the cage again for another attempt. According to Thorndike’s report (1898), the amount of time it took for the animal to escape gradually decreased and the movements the animal made tended to be more focused over attempts, suggesting that the animal actually learned how to get out of the cage by and by.

     

    3. LAW OF EXERCISE & LAW OF EFFECT (Thorndike, 1905; Thorndike, 1912)

    Drawing from his studies of animal learning, Thorndike introduced the two laws for habit formation in the learning process, Law of Exercise and Law of Effect.

In simple words, the Law of Exercise states that the more repeatedly or strongly a behavior is connected to a situation, the more likely that behavior is to be performed in response to the same situation in the future. Rather common sense to us, this law refers to practice in learning. Take the animal’s improvement in unlocking the Puzzle Box (Thorndike, 1898) over many attempts as an example: the animal became familiar with the movements that linked more consistently to escape, and concentrated on such movements in later attempts, which led to its escaping more and more quickly.

The Law of Effect, meanwhile, refers to consequences: the more satisfying a consequence associated with a behavior is, the more likely that behavior is to be performed again; likewise, the more dissatisfying a consequence associated with a behavior is, the less likely that behavior is to be performed again. Briefly put, pleasant consequences encourage a behavior, while unpleasant ones discourage it. This law was reflected in Thorndike’s studies with cats (1898) in how the animals stopped making ineffective movements toward escape, such as trying to squeeze themselves through the gap between the bars, after these movements left them frustrated. As emphasized by Thorndike (1912), the Law of Effect is significant in education as a fundamental principle for building desired habits and eliminating undesired behaviors.
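Read computationally, the Law of Effect says that satisfying consequences strengthen, and annoying consequences weaken, the connection between a situation and a response. The toy sketch below is only an illustration of that reading (not Thorndike’s formulation or data): response weights are nudged up or down by their outcomes, and responses are then chosen in proportion to their weights.

    import random

    # Toy illustration of the Law of Effect. The responses, outcomes, weights,
    # and step size are arbitrary assumptions for demonstration.

    weights = {"press bar": 1.0, "pull string": 1.0, "squeeze between bars": 1.0}
    outcomes = {"press bar": +1, "pull string": +1, "squeeze between bars": -1}  # satisfying vs. annoying

    rng = random.Random(0)
    for trial in range(200):
        response = rng.choices(list(weights), weights=list(weights.values()))[0]
        # strengthen or weaken the situation-response connection by its outcome
        weights[response] = max(0.1, weights[response] + 0.1 * outcomes[response])

    print({response: round(w, 2) for response, w in weights.items()})
    # Effective movements end up with larger weights (more likely to be repeated);
    # ineffective ones, like squeezing between the bars, fade away.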


Image 4. A simplified visualization of Watson and Rayner’s Little Albert experiment.
a) The rat was presented to Little Albert.
b) As the baby moved to touch the rat, the loud noise was made.
c) The baby jumped and hid his face in the mattress.
d) The repeated combination of the rat and noise eventually made Albert cry.
e) At one point, the baby cried at the mere sight of the rat.

     

    4. LITTLE ALBERT (Watson & Rayner, 1920)

Watson and Rayner’s experiment with Little Albert was one of the earliest demonstrations of classical conditioning. The experiment further explored the effect of associating a neutral factor with a response-evoking stimulus. While Pavlov’s conditioning dealt with a reaction of desire (the dog’s salivation), Watson and Rayner’s looked into a reaction of fear.

Little Albert was the alias the researchers used for the participant in this experiment, an 11-month-old boy at the hospital where the experimental laboratory was located. Prior to the study, when Little Albert was 9 months old, the experimenters administered emotional tests to him to see whether he would react in fear when confronted with certain stimuli, including a white rat, a rabbit, a dog, a monkey, hairy and non-hairy masks, cotton wool, burning newspaper, and a loud noise. It was reported by his mother and hospital attendants, from their casual observations, that the boy had never displayed fear or anger or cried. Test results confirmed that the boy showed no fear reaction to the presented stimuli – except a loud noise created by striking a hammer against a steel bar. On hearing the sound for the first time, Little Albert startled and raised his arms, and on the third time, he burst into tears.

    As Little Albert reached 11 months old, the experiment was started, with a white rat as the CS and the loud noise as the US.

    First of all, the rat was shown to Little Albert and he moved his hand to touch it. At that moment, the hammer was struck against the bar. Little Albert’s response was jumping up and hiding his face into the mattress he was seated on.

    One week after the first trial, the rat was shown to Little Albert again. The baby withdrew his hand from the rat as it touched him. The combination of rat and noise was presented again and again, following certain intervals. After every trial the baby’s fear response intensified a little – falling to another side, whimpering, and eventually, crying. At the end of the experiment with the rat, Little Albert was seen to cry just at the mere sight of the rat.

The experiment continued until Little Albert was almost two, revealing that he also appeared to develop a fear of things resembling the rat: a rabbit, a dog, a fur coat, cotton wool, and a Santa Claus mask. Response intensity varied between these stimuli, but the general reaction was trying to avoid them, which the boy had never displayed in the emotional tests. Dissimilar objects such as toy blocks or newspaper, in contrast, did not provoke Little Albert at all. The boy touched and played with them normally.

Little Albert was discharged from the hospital before an extinction process could be applied to his conditioned fear. Because of this, further examination of Little Albert’s case was never completed, and the study remains controversial to this day for its ethical issues.

     

    5. OPERANT CONDITIONING (Skinner, 1938)

In 1938, B. F. Skinner introduced the concept of operant conditioning (sometimes referred to as instrumental conditioning). In his book The Behavior of Organisms: An Experimental Analysis, he detailed the distinctions between Pavlovian-type conditioning – now classical conditioning – and operant conditioning. Basically, the two can be distinguished by the type of behavior they manipulate, the reinforcing stimuli they employ, and the laws of conditioning and extinction they follow.


       Image 5. A simplified visualization of Skinner Box Study with rats.

Firstly, Pavlovian-type conditioning works with respondent behaviors, which are elicited by prior, specific stimuli and ruled by static laws. For instance, the dog’s salivation at the sight of food is driven by the natural instinct of eating. The behaviors operant conditioning is concerned with (termed by Skinner operant behaviors), in contrast, appear spontaneous and are not tied to any known stimuli.

The second difference between the two types of conditioning – the reinforcing stimuli – refers to the events associated with the occurrence of the conditioned behavior. In Pavlovian-type conditioning, the reinforcing stimulus takes place prior to the behavior. In operant conditioning, the reinforcing stimulus is introduced following the behavior.

    Thirdly, classical conditioning increases the strength of the reflex underlying the behavior, causing the behavior to be produced in response to other stimuli beside the reinforcing one, while operant conditioning increases the strength of the behavior itself, causing it to be repeated for the reward of the reinforcing stimuli.

The theory and application of operant conditioning have been expanded throughout the history of behavioral research, including many studies by Skinner himself.

    6. SKINNER BOX (Skinner, 1938)

The operant conditioning chamber, also known as the Skinner Box, was a device designed by Skinner for testing the learning capacity of rats as part of his research on operant conditioning. It had the overall form of a cage (much like Thorndike’s Puzzle Box) with an integrated mechanism which supplied the caged animal with food when it managed to perform a certain behavior.

In a study, a rat placed in the Skinner Box (see Image 5), after running around to find a way to escape, would accidentally push a lever fixed on a wall. The lever, once pushed down, would operate the food mechanism set outside the box, which then delivered food pellets through a hole into a small tray inside the box. After repeated occurrences of food delivery, the rat gradually formed the habit of pressing the lever. This habit formation was the result of operant conditioning with food as the reinforcing stimulus.

     

    7. BOBO DOLL EXPERIMENT (Bandura, 1965)

The 1965 study by Albert Bandura shed light on another behavior that is also significant in learning, among animals and humans alike: imitation. The hypothesis was that one would imitate a behavior more readily if they were shown, or themselves given, positive reinforcement for that behavior.


              Image 6. A simplified visualization of Bobo Doll Experiment

Sixty-six kindergarten children were selected as participants for the experiment. Randomly divided into three separate rooms, they all watched a clip which showed an adult violently punching a Bobo doll. At the end of the clip in Room 1, the adult was praised as a “strong champion” and given sweets and drinks by another adult. At the end of the clip in Room 2, the adult was reprimanded and spanked by another adult for being “a bully”. The clip in Room 3 ended with no consequence to the adult.

Each of the children was then led into a playroom where a range of toys was displayed, including a Bobo doll, balls, cars, dolls, plastic animals, and so on. They were instructed to play with the toys as they pleased, and then left alone in the room. Observed by the experimenters through a one-way mirror, the child’s activities were assessed with respect to how closely they matched the adult’s behaviors in the clip.

After that, the experimenter entered the room again, bringing attractive-looking drinks and sticker books as incentives, and told the children that they would be given a sticker and a drink treat every time they successfully imitated one of the adult’s behaviors in the clip. The children’s responses were again observed and assessed.

The results revealed that Group 1 and Group 3, in both the incentive session and the non-incentive playroom, performed significantly more behaviors matching the adult’s in the clip than the children in Group 2. This suggested that the consequence to the model influenced the watchers’ imitation: those who saw the model receive rewards or no consequence tended to imitate them more than those who saw the model receive punishment. The fact that the levels of imitation recorded were higher in the incentive sessions than in the non-incentive playroom also fitted the prediction that rewards would encourage imitation.

     

    8. SOCIAL LEARNING THEORY (Bandura, 1969; Bandura, 1971)

Along with classical conditioning and operant conditioning, social learning theory forms an important theoretical underpinning of behaviorism. Generalized from studies which examined modeling and imitation, including Bandura’s Bobo doll experiment, the theory postulates that learning can take place in social contexts, in which the learner picks up a modeled behavior by attentively observing, remembering, and reproducing it. Unlike conditioning, social learning does not result from the association between the behavior and reinforcing stimuli, but rather from the guidance of symbolic representations which the learner detects and absorbs from the model. An example is how some participants in the Bobo doll experiment imitated the violent behaviors of the model adult even before they were encouraged to do so with incentives, or, more commonly, how teenagers emulate celebrity idols they see on television.

    Although specific reinforcing stimuli are not essential for the occurrence of social learning, such stimuli – namely, rewards and motivations – can help boost the learned behavior into overt expression while discouraging factors inhibit it. A teenager who enjoys copying the styles of celebrity idols may restrain such behaviors at home in the presence of her disapproving parents, but will boldly exhibit them when she is among like-minded peers.

     


    References: 

    Bandura, A. (1965). Influences of models’ reinforcement contingencies on the acquisition of imitative responses. Journal of Personality and Social Psychology, 1 (6), 589 – 595.
    Bandura, A. (1969). Social learning theory of identificatory processes. In D. A. Goslin (Ed.), Handbook of Socialization Theory and Research (pp.213-262). Chicago, IL: Rand McNally.
    Bandura, A. (1971). Social Learning Theory. New York, NY: General Learning Press.
    Pavlov, I.P. (1927). Conditioned Reflexes: An Investigation of the Physiological Activity of the Cerebral Cortex (G. V. Anrep, Trans.). London, UK: Oxford University.
    Skinner, B. F. (1938). The Behavior of organisms: An experimental analysis. New York, NY: Appleton-Century.
    Thorndike, E. L. (1898). Animal intelligence: An experimental study of the associative processes in animals. Psychological Monographs: General and Applied, 2 (4), i-109.
    Thorndike, E. L. (1905). The Elements of Psychology. New York, NY: A. G. Seiler.
    Thorndike, E. L. (1912). Education, a First Book. New York, NY: The MacMillan Company.
    Watson, J. B., & Rayner, R. (1920). Conditioned emotional reactions. Journal of Experimental Psychology, 3(1), pp. 1–14.