Article summaries on Understanding Psychopathology 20/21

These summaries give insight into research that tries to unravel the mechanisms behind psychopathology. This set of articles is based on the 2020-2021 course 'Understanding Psychopathology' at the University of Groningen.

Topics that will be discussed: mental illness, mental disorders, anxiety, depression, panic (disorder), psychotherapy, mental cognition, social psychology, phobias, addiction


Article summary with A complex systems approach to the study of change in psychotherapy by Hayes & Andrews - 2020

What is this article about?

There are many therapies that aim to treat mental disorders. However, little is known about how these treatments work, that is, about the processes of change. This is unfortunate, because understanding which factors facilitate and inhibit therapeutic change can guide researchers in improving treatments and in reducing relapse and recurrence. The complex systems approach is used to study change across different physical and natural systems, ranging from cells and neurons to political and economic systems. There have been attempts to apply complex systems science in psychology and psychiatry, but uptake has been slow. One reason for this is the randomized controlled trial (RCT) design. RCTs are important for evaluating treatment efficacy, but they often focus on one component of functioning, such as cognitions, emotions, behavior, or physiology, instead of on the multi-component patterns central to the complex systems approach. Another barrier to the uptake of the complex systems approach is that different subdisciplines in psychology and psychotherapy use different jargon and concepts. This makes it difficult to detect common themes and principles. In this article, the authors advance the complex systems approach by presenting some basic principles in a way that is true to complexity science and accessible to researchers and clinicians. The goal is to provide an integrative framework that helps to translate concepts into a common language and provides a structure for conceptualizing and studying different treatments and clinical problems.

What are the general principles in complex adaptive systems?

Pattern formation and attractors

A dynamic system is defined as a set of interconnected elements which evolve over time and self-organize into higher-order functional units, called attractor states. Self-organization is the process by which lower-order processes interact, higher-order patterns emerge, and these patterns in turn influence the lower-order processes in a top-down way. Attractor states predict behavior: when the system is perturbed, the attractor 'pulls' behavior back toward the established pattern. Well-established attractor states have strongly interconnected elements, with reinforcing and inhibiting feedback loops. These feedback loops can increase or decrease the probability of activation over time and contexts. Well-established attractor states do not change easily: a significant disturbance is needed before the pattern is broken. Less developed attractors, however, are more easily changed.

System change: tipping points and nonlinear transitions

Complex systems can adapt to defend against challenges. Both deterministic (purposeful, causal) and stochastic (naturally occurring random events, fluctuations, noise) forces can affect a complex system. The chance of transitioning from one attractor to another depends on the strength of that attractor, the type of perturbation, the parameters that control system organization, and the strength of alternate attractors. Change can be gradual, but when the parameters reach a 'tipping point', the dominant state can shift suddenly. This type of change is often abrupt, with periods of turbulence in which attractors destabilize and create the potential for phase or order transitions. During these transitions, systems can reorganize into new patterns of functioning, which may, for example, shift one's state from healthy to diseased. It can be important to know when these transitions will occur: early warning signs of symptom exacerbation or transition to disease can inform clinicians' treatment decisions and can even be a matter of life and death. Scientists have therefore identified early warning signals that often precede such transitions. Two of these are critical fluctuations and critical slowing. Transitions can involve movement from healthy to maladaptive states, but also movement in the opposite direction. For example, critical slowing also precedes recovery in response to interventions.
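The two early warning signals named above can be quantified with simple rolling-window statistics. The sketch below is illustrative only (the time series is simulated, not data from the article): lag-1 autocorrelation in a moving window tracks critical slowing, and variance tracks critical fluctuations.

```python
import numpy as np

def rolling_indicators(series, window=30):
    """Lag-1 autocorrelation and variance in a moving window.

    Rising autocorrelation over time is the signature of critical slowing;
    rising variance is the signature of critical fluctuations."""
    ac, vr = [], []
    for i in range(len(series) - window + 1):
        w = series[i:i + window]
        vr.append(np.var(w))
        # correlation of the window with itself shifted by one step
        ac.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(ac), np.array(vr)

# Simulated daily symptom ratings whose persistence grows toward a
# transition (illustrative data only, not from the article).
rng = np.random.default_rng(0)
n = 200
x = np.zeros(n)
for t in range(1, n):
    phi = 0.1 + 0.8 * t / n            # autocorrelation parameter drifts upward
    x[t] = phi * x[t - 1] + rng.normal()

ac, vr = rolling_indicators(x)
print(round(ac[:20].mean(), 2), round(ac[-20:].mean(), 2))   # autocorrelation rises
print(round(vr[:20].mean(), 2), round(vr[-20:].mean(), 2))   # variance rises
```

In this toy series both indicators increase as the system approaches the transition, which is exactly the pattern early-warning research looks for in clinical time series.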

Patterns are weak until they are strengthened and stabilized through repeated activation across contexts and by feedback loops. There can also be a period of vacillation or 'flickering' between attractors. When a new attractor strengthens, it can compete with a pre-existing attractor and prevent a return to it. For example, when someone is trying to create a healthy habit, their previous (unhealthy) habits are strong. One needs to repeatedly engage in the new healthy habit until it is sufficiently strong, consolidated, and maintained in memory. Processes in complex systems can also operate on different timescales: some move slowly, others move quickly.

Application to psychotherapy: network destabilization and transition model

The authors developed ‘the network destabilization and transition model’ (NDT) as a framework that includes concepts and principles from complexity sciences (dynamical systems theory, synergetics, and network science) and uses this for psychotherapy research. This model was initially meant to be used in the treatment of depression, but it can be used for psychotherapy in general. The goal of the authors is to stimulate new research and to provide a framework for understanding and organizing findings from different research.

The goal of psychotherapy is to promote new learning, that is, to move a person from entrenched patterns of psychopathology to more flexible and functional patterns. Some researchers describe psychopathology as an attractor state with interacting elements across cognitions, emotions, behavior, and physiology. The goal of therapy is to change these patterns and processes. Tschacher and Haken also suggest considering contextual factors, the therapeutic relationship, and environmental and random (stochastic) factors, which can all influence the change process.

There are different routes through which therapeutic change can happen. More than two patterns can be relevant to psychopathology, and it is not clear whether pathological and healthy states are different networks or rather parts of a single, large network.

Change in psychotherapy can thus take different forms. First, minor adjustments can be made to maladaptive patterns. For example, harm reduction strategies, such as providing clean needles to people addicted to drugs, reduce some negative consequences of pathological patterns but do not completely abolish the attractor states. Other approaches include distress tolerance, mindfulness, and positive emotion activation. These can change the threshold of activation and the automaticity of both pathological and more functional patterns. For example, behavioral, interpersonal, cognitive reappraisal, emotion regulation, or parenting skills can all be used to: reduce feedback loops that block new information and interfere with new learning; deactivate or unhook from the pathological patterns of the attractor; and/or compensate for or override pathological patterns. These strategies can also be used to decrease exposure to stochastic factors or to reduce their influence. Thus, all of these strategies work within the pathological attractor, but they do not change the attractor directly!

Another type of change refers to 'switching' from a pathological to a healthy attractor. To achieve this, an alternative should be available. For example, the therapist can provide a supportive environment and therapeutic alliance, so as to increase the patient's readiness, resources, and skills to develop healthy behaviors. An example is Beck's recovery-oriented cognitive therapy for schizophrenia, in which patients learn to switch from a 'patient mode' to an 'adaptive mode': from a disorder-focused mode to a mode in which they build positive beliefs, aspirations, strengths, and values. This adaptive mode is then repeatedly activated and exercised to increase its accessibility and strength. Positive emotion activation approaches can also help to build healthier attractors. Finally, another type of change involves destabilizing the pathological attractor and developing a new, healthier attractor. This can be achieved through exposure therapy, insight-oriented therapy, and emotion-focused and cognitive restructuring techniques.

What are some important considerations?

The complex systems approach as described is important for psychotherapy research. It emphasizes the need for longitudinal data, the study of discontinuous and nonlinear change, and a focus on patterns of functioning rather than single components.

Data collection considerations

When conducting time-series research, it is important to select the appropriate time interval (sampling rate), namely the one that is most sensitive for detecting changes in the variables of interest. Some variables change slowly, while others change more quickly.
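Why the sampling interval matters can be shown with a toy simulation (a hypothetical signal, not data from the article): a process that oscillates within the day looks completely flat when it is sampled only once per day, because every measurement hits the same phase of the cycle.

```python
import numpy as np

# Hypothetical symptom that cycles within the day (period = 1 day; t in days).
signal = lambda t: np.sin(2 * np.pi * t)

# Sampling once per day always hits the same phase: the cycle disappears.
daily = signal(np.arange(0, 14, 1.0))
# Sampling every six hours resolves the within-day dynamics.
six_hourly = signal(np.arange(0, 14, 0.25))

print(f"std, daily sampling:  {daily.std():.3f}")       # ~0: cycle invisible
print(f"std, 6-hour sampling: {six_hourly.std():.3f}")  # cycle captured
```

The same logic applies in reverse: a very fast sampling rate adds little for variables that change only over weeks, while it burdens participants.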

Breadth and duration of assessment

When researchers use microanalytic assessments, such as assessments on a timescale of minutes, the sampling rate is high, but the number of variables and the duration of assessment are necessarily limited. Ideally, researchers should measure pathological patterns and symptoms over the course of therapy, between sessions, and after therapy to capture therapeutic change. Researchers could also gather passive data (activity level, exercise, sleep, social media usage).

Level of analysis

The complex systems approach operates at the individual level, whereas psychotherapy research typically operates at the nomothetic level of group averages. It is important to use individual-level data, because findings from one level might not directly generalize to the other. Individual-level data allow studying the dynamics of a given person, which can help in treatment. At the same time, science needs to detect patterns and principles that generalize across people. These levels can be bridged by identifying common indices of early warning signals across systems and sciences. Ellison and colleagues used ecological momentary assessment methods and Group Iterative Multiple Model Estimation (GIMME) to show how the levels can be combined.

What are further important considerations?

Different trajectories of change

When using the complex systems approach, trajectories of symptom change are important. In psychotherapy, a general assumption is that change is gradual and linear. However, time-course data have shown that this is not always the case and that the process can follow a nonlinear course. Changes in therapy can follow a quadratic (U-shaped) pattern, a cubic pattern, a saw-toothed pattern, and other nonlinear patterns. Think of relapse in addiction treatment.

Early warning signals

According to the complex systems approach, increased variability and turbulence in psychotherapy signal potential for change. Few studies have examined early warning signals in psychotherapy. There are different ways to measure them, for example using GridWare, recurrence quantification analysis, or dynamic complexity; dynamic complexity can be calculated in R. An important avenue for future research is to examine which early warning signals predict transitions across clinical problems and treatments, and which represent therapeutic change.
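As a rough illustration of what a dynamic complexity measure captures, the sketch below computes a simplified stand-in: per moving window, the mean absolute change (fluctuation) multiplied by how much of the rating scale is covered (distribution). This is a hedged simplification for intuition only, not the exact published Schiepek & Strunk formula implemented in the R tools the article refers to.

```python
import numpy as np

def dynamic_complexity(series, window=7, scale=(0, 6)):
    """Simplified stand-in for a dynamic complexity measure.

    Per window: fluctuation = mean absolute successive change, and
    distribution = fraction of the rating scale covered; their product is
    high when a series both changes rapidly and spans the scale.
    (Illustrative simplification, not the exact published formula.)"""
    lo, hi = scale
    series = np.asarray(series, dtype=float)
    out = []
    for i in range(len(series) - window + 1):
        w = series[i:i + window]
        fluctuation = np.abs(np.diff(w)).mean() / (hi - lo)
        distribution = (w.max() - w.min()) / (hi - lo)
        out.append(fluctuation * distribution)
    return np.array(out)

# A flat series vs. a turbulent one on a 0-6 rating scale (made-up data).
flat = np.full(21, 3.0)
turbulent = np.tile([0.0, 6.0, 1.0, 5.0, 2.0, 6.0, 0.0], 3)

print(dynamic_complexity(flat).max())       # 0.0: no fluctuation at all
print(dynamic_complexity(turbulent).mean()) # clearly above zero
```

A spike in such a complexity index over the course of therapy is the kind of destabilization signal the early-warning literature is looking for.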

Patterns and feedback loops

As noted, attractors consist of interconnected elements; patterns, not single components, are therefore the focus of study. Again, recurrence quantification analysis and related tools (GridWare, the Synergetic Navigation System) can be used to study multi-component patterns for individuals. One can also use network analysis tools, which quantify the structure, density, connectivity, and threshold of activation of patterns, and how they change over time. Network analysis can depict and measure patterns of psychopathology for a given sample, but also personalized for a specific individual, which may guide treatment decisions and selection. A limitation of network analysis, however, is that it assumes 'stationarity': each variable, over time, demonstrates similar means, variances, and relationships with other variables and with itself.
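The core idea of a symptom network can be sketched in a few lines: symptoms are nodes, and edges are weighted by how strongly symptoms covary. Everything below is made up for illustration (variable names, data, and the 0.3 threshold), and plain correlations are used only to keep the sketch short; published network analyses typically estimate regularized partial correlations instead.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated scores on four symptoms for 300 people (illustrative data):
# 'worry' drives 'insomnia', which drives 'fatigue'; 'appetite' is independent.
n = 300
worry = rng.normal(size=n)
insomnia = 0.7 * worry + rng.normal(scale=0.7, size=n)
fatigue = 0.7 * insomnia + rng.normal(scale=0.7, size=n)
appetite = rng.normal(size=n)
data = np.column_stack([worry, insomnia, fatigue, appetite])
labels = ["worry", "insomnia", "fatigue", "appetite"]

# Edge weights = pairwise correlations; keep only edges above a threshold.
corr = np.corrcoef(data, rowvar=False)
threshold = 0.3
edges = [(labels[i], labels[j], round(float(corr[i, j]), 2))
         for i in range(len(labels)) for j in range(i + 1, len(labels))
         if abs(corr[i, j]) > threshold]
print(edges)
```

In the resulting edge list, the causally linked symptoms are densely connected while the independent one stays isolated, which is how network density and connectivity become quantifiable properties of a pattern.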

Interplay of pathological and new patterns of learning

The complex systems approach thus suggests that new attractors can develop and build in strength so that they can compete with or inhibit old attractors. Modern cognitive development and learning theories also suggest that psychotherapy promotes new learning through the establishment of new patterns. However, little research has been conducted into how this old-new attractor competition takes place.

What can be concluded?

Thus, the NDT model for therapeutic change is a conceptual framework. The goal is to identify and translate concepts from subdisciplines of complexity science to psychotherapy research. In turn, these concepts and methods can help to increase the effectiveness of therapeutic treatments. Also, having a common organizational structure may benefit science in general.

Article summary with Retrieving and Modifying Traumatic Memories: Recent Research Relevant to Three Controversies by Engelhard a.o. - 2019

Summary with the article: Retrieving and Modifying Traumatic Memories: Recent Research Relevant to Three Controversies - Engelhard, McNally & van Schie - 2019.

What is this article about?

In this article, the authors review recent research relevant to three controversies about memory for trauma. First, they present an interpretation of recovered memories that relies on neither repression nor false memory. Second, they discuss the idea that trauma memories often lack narrative structure and that this can contribute to the development of posttraumatic stress disorder (PTSD). Lastly, they discuss research on eye-movement desensitization and reprocessing (EMDR) therapy, which aims to reduce PTSD symptoms.

What are the controversies concerning memories of trauma?

A nonrepression account of recovered memories

The repression perspective states that people become unable to recall memories of childhood sexual abuse because these memories are too emotionally painful. However, the authors state that studies within this perspective unjustly mix distinct memory phenomena. For instance, they interpreted normal forgetfulness as 'an inability to recall trauma', and they conflated organic amnesia with psychic repression. Thus, claims about repression are often not scientifically underpinned. Following the false-memory perspective, people who report recovered memories of childhood sexual abuse are reporting false memories, especially when these memories surface during recovered-memory therapy. Of course, not all such memories are false, even though there are many documented instances of false trauma memories. A third possibility is that adults experienced childhood sexual abuse without experiencing the terror associated with trauma at the time. For a long time, they did not understand that what happened to them was abuse; only later did they come to this understanding, and some of them develop PTSD after it.

Are traumatic memories fragmented and incoherent?

According to some researchers, memories of trauma are often fragmented, incomplete, and lacking in narrative coherence, especially in individuals with PTSD. These researchers state that patients have to emotionally process their traumas and create a coherent narrative before they can recover. In one study, Rubin and colleagues (2016) examined 60 trauma-exposed adults, of whom half had PTSD, matched on trauma type (combat, childhood sexual abuse, accidents). The participants were instructed to recount three traumatic, three very positive, and three very important memories. The narratives were recorded, transcribed, and rated for coherence. Most trauma memories turned out to be coherent, and participants with PTSD did not have less coherent memories than trauma-exposed participants who did not meet the criteria for PTSD. These findings thus contradict the idea that trauma memories are characterized by a lack of narrative coherence.

Bedard-Gilligan and colleagues also tested whether traumatic memories have to be integrated and coherent before people can recover. They used a sample of PTSD patients who had received exposure therapy or sertraline. From each patient, they obtained a trauma narrative, a positive narrative, and a negative, non-trauma-related narrative, and they evaluated the fragmentation of these memories before and after treatment. Memory fragmentation was not related to change in therapy: even when people recovered, fragmentation did not change. Instead, it reflects a person's style of recounting autobiographical memories. Thus, trauma-related memories are not more fragmented than non-trauma-related memories, but they do cause patients to suffer. Erasing memories is not the goal, because remembering danger can be important. Ideally, someone would remember the event without feeling the negative emotions associated with it.

Are eye movements in EMDR therapy effective?

In EMDR, patients recall a traumatic memory while visually tracking the therapist's fingers as they move back and forth in front of the patient's eyes. A long-running debate about EMDR concerns how eye movements help to reduce the impact of traumatic memories. A recent meta-analysis has shown that lateral eye movements enhance the effectiveness of exposure therapy, and a placebo effect does not seem to explain this. Rather, it is suggested that lateral eye movements tax working memory resources that are essential for memory retrieval. When people recall a memory, distraction can interfere with retrieval; this also helps to reduce imagination inflation. Studies testing this working memory theory showed that other dual tasks competing with memory retrieval, such as vertical eye movements, counting backward, attentional breathing, and playing the computer game Tetris, are also effective, whereas passive dual tasks, such as listening to beeps or finger tapping, are not. Eye movements also do not seem to work when they are slow, or when they are combined with a different memory than the one assessed in the pre- and post-tests. Furthermore, the eye-movement procedure is more effective for visual memories than for auditory memories, and it works not only against distressing memories but also for imagined future threats, positive memories, and substance-related imagery. All in all, the exact underlying mechanism of this treatment remains unknown.

Article summary with Advancing understanding of executive function impairments and psychopathology: bridging the gap between clinical and cognitive approaches by Snyder a.o. - 2015

Introduction

Executive function (EF) helps navigate most daily activities and comprises a set of cognitive processes that allow self-regulation and self-directed behaviour toward a goal. EF impairments are associated with most forms of psychopathology. Poor EF predicts rumination and poor use of emotion regulation – both risk factors for psychopathology.

There’s a lot of parallel play between clinical and cognitive approaches to EF, potentially leading to failures to apply theoretical and methodological advances in one field to the other, hindering progress.

Three main goals:

  1. Review the current state of knowledge of EF impairments associated with psychopathology and limitations to previous research in light of recent advances in understanding/measuring EF. EF impairments seem to be transdiagnostically related to psychopathology, but limitations of prior research make existing evidence hard to interpret. So specific nature and patterns of impairments are unclear.
  2. Offer concrete suggestions for improving assessment of EF, based on conceptual and methodological issues in current research, to advance clinical science. Advocate for better EF assessment. Obtain purer measures, select and analyze tasks minimizing noisiness of EF data.
  3. Suggest future directions in EF research and clinical psychological science. Including integrating modern models of EF with hierarchical models of dimensional psychopathology as well as translational implications of EF-informed research on clinical science.

EF Impairments Associated with Psychopathology: Current State of Knowledge

EF is best described as consisting of separable but related cognitive processes, with unique and shared individual differences, genetic influences, and neural substrates. Aspects of EF that have been heavily studied in clinical psychology include shifting, inhibition, updating, working memory manipulation, verbal fluency, and planning – many of these can be further subdivided.

Previous research on EF impairments associated with psychopathology reviewed in this section has mainly used cross-sectional designs in adult samples, and assessed EF with traditional neuropsychological tasks. But there are limitations in the literature – imposing constraints on the state of knowledge and what can be determined through meta-analysis:

  1. Many neuropsychological EF measures tap multiple aspects of EF as well as non-EF abilities. These tasks help screen for severe deficits, but are too broad to answer questions about specific aspects of EF and potential underlying mechanisms.
  2. Because they were developed to detect more severe deficits, many traditional neuropsychological tasks may lack sensitivity to detect subtler deficits.
  3. These limitations carry over into meta-analyses. In many meta-analyses on EF, tasks are grouped into the processes they are commonly considered to tap. But these categories may lump together tasks that actually tap different and/or multiple processes.

Despite these limitations, meta-analytic evidence indicates that EF deficits are pervasive across disorders and EF tasks.

Impairments on More Specific EF Components: Inhibition, Shifting, Updating, and Working Memory

Definitions:

  • Inhibition (I): suppressing/resisting an automatic response in order to make a less automatic but task-relevant response.
  • Shifting (S): switching between task sets or response rules.
  • Updating (U): monitoring and coding incoming information for task-relevance, and replacing no longer relevant information with newer, more relevant information.
  • Working memory (WM): actively maintaining and manipulating information across a short delay.

Impairments across disorders:

  • Schizophrenia: largest EF deficits found. Large effect sizes (ES) on measures of S, I, U, visuospatial WM, and verbal WM manipulation. Medium ES for simple verbal WM maintenance.
  • Mood disorders: same pattern of impairments as in schizophrenia, but with a smaller magnitude of deficits.
  • Major depression (MDD): impaired across the same domains as schizophrenia, with smaller ES.
  • Bipolar disorder (BD): larger impairments than MDD, and also fairly uniformly impaired across EF domains. Medium ES for S, I, visuospatial WM, and verbal WM manipulation. Small but significant ES for verbal WM maintenance.
  • Obsessive-compulsive disorder (OCD): impairments across most domains. Small but significant ES for S, I, visuospatial WM, and verbal WM manipulation. Large ES for U. Simple WM unimpaired. Depression often co-occurs with OCD, but the deficits in OCD are not driven by co-occurring depression.
  • Posttraumatic stress disorder (PTSD): compared with trauma-exposed people who do not develop PTSD, people with PTSD perform worse on shifting (medium ES) and visuospatial WM (small ES). PTSD patients also show inhibition deficits. Unlike in OCD, co-occurring depression may account for the deficits in PTSD patients.
  • Anxiety disorders: little EF research, mixed findings. Research in nonclinical samples suggests that trait anxiety is associated with impairments in specific aspects of EF, notably inhibiting competing responses. There is little WM research as well. Lastly, there is evidence that poor EF contributes to the attentional bias toward threat in anxious people, which is involved in anxiety maintenance.
  • Attention-deficit/hyperactivity disorder (ADHD): impairments in S, I, visuospatial WM, and verbal WM manipulation (small-medium ES) in children and adults. Verbal WM maintenance is less impaired. Updating has not been widely studied. EF is also impaired in other externalizing disorders, like oppositional defiant disorder and conduct disorder, but these deficits could be partly accounted for by co-occurring ADHD.
  • Substance use: S, I, and WM impairments across most substance use disorders, generally with medium ES. Reviews of EF impairments in substance use are difficult to interpret: 1) the inclusion of polysubstance users makes the effect of individual drugs difficult to isolate; 2) given the neurotoxic effects of alcohol and other drugs, it is unclear to what extent these deficits are a cause or a consequence of substance use.

Complex Tasks: Verbal Fluency and Planning

Complex tasks may tap multiple aspects of EF – problematic if the goal is to understand which specific processes are impaired. Nonetheless, these tasks are still used in clinical studies of EF.

Deficits in verbal fluency are widespread across disorders. Meta-analyses show that for adults with schizophrenia and depression, the largest deficit is in semantic verbal fluency. Semantic verbal fluency is also impaired in people with BD, OCD, and ADHD, but the evidence for verbal fluency deficits in PTSD patients is inconsistent. There is little research on verbal fluency in anxiety disorders, with mixed results.

Verbal fluency tasks impose multiple EF demands. One possible reason why semantic verbal fluency is more impaired in patients with schizophrenia, BD, and depression is that it may place heavier demands on shifting. Another possibility is that semantic memory retrieval deficits contribute to the semantic verbal fluency impairment, especially in schizophrenia. Conversely, the larger effect for phonemic verbal fluency in ADHD patients could be due to deficits in phonological processing, since ADHD and reading disabilities frequently co-occur. Deficits in verbal fluency can thus come from various sources, making results from complex tasks difficult to interpret.

Planning is less studied. Patients with depression and BD show significant impairments in planning. Meta-analyses found mixed results for planning tasks in people with ADHD and OCD, and there is inconsistent evidence for planning deficits in PTSD. In theory, planning tasks tap multiple aspects of EF, but standard measures of planning may be less sensitive than other tasks at detecting subtler deficits in some disorders.

Summary of Previous Findings

  • Evidence shows that deficits on various EF tasks are associated with numerous forms of psychopathology. Most disorders show fairly uniform deficits across EF tasks, though there are variations in effects. The results seem consistent with a broad, transdiagnostic impairment in EF. The exception is verbal WM maintenance, which shows smaller deficits. The findings support the view that WM deficits in these disorders reflect impairment in the central executive of working memory rather than in content-specific maintenance systems – consistent with the view that psychopathology is associated with broad EF impairments rather than with deficits in specific aspects of EF.

Limitations of Previous Research and Suggestions for Future Research

EF is hard to study, define, and measure. Below, the limitations of how EF has been defined, conceptualized, and measured in previous research are outlined, and concrete suggestions are presented to address these limitations.

Conceptual Issues: Models of EF

Many previous clinical studies of EF have treated it either as unitary or as a list of separate, specific abilities. Seeing it as unitary over-lumps diverse tasks into a single construct; seeing it as a list over-splits, treating tasks as if they assessed separate abilities rather than a common set of component processes supporting the completion of more complex tasks.

The best evidence indicates that individual differences in EF show both unity and diversity: different components of EF correlate with each other, tapping some common underlying ability (unity), while also showing separability (diversity). This general structure of common and specific elements is shared by different models of EF, which focus on different components and levels of analysis.

  • Behavioural level of analysis – different models have focused on partly overlapping sets of EF components, e.g. Baddeley's central executive system with subsystems, or two-factor models of EF.
  • Neural level – models propose that distinct, interconnected prefrontal regions support functions like setting task goals, initiating responses, etc.

Though these models differ, they have points of convergence, often agreeing on core cognitive and neural mechanisms involved in EF.

Unity/diversity model – captures several features of what are believed to be the key components of EF, is practical for understanding EF at the behavioural level, and has the potential to shed light on commonalities and differences in impairments across populations by differentiating common and specific components of EF. It focuses on three aspects of EF: updating WM, shifting, and inhibition. This unity/diversity pattern has been consistently found across samples. Each EF ability can be decomposed into what is common across all three (unity, or 'common EF') and what is unique to each ability (diversity).

The unity/diversity model suggests decomposing task performance into common and specific abilities that could better map the underlying cognitive processes. New approach that has produced significant findings:

  1. No unique variance is left for inhibition after accounting for common EF – individual differences in common EF fully account for individual differences in inhibition.
  2. Common EF and shifting-specific components sometimes show opposing patterns of correlations with other measures, suggesting possible trade-offs between stability and flexibility. Specific deficits in stability or flexibility only become apparent when performance on shifting tasks is decomposed into common-EF and shifting-specific factors.
  3. The different components of EF identified by this model differentially predict individual differences in clinically important behaviours, with common EF as the primary source of this predictive power. The similarity of effect sizes across EF domains in many disorders suggests that psychopathology may be broadly associated with impairment in common EF, which would imply that decomposing EF has limited additional implications for understanding the EF deficits associated with psychopathology.

Methodological Issues

Multiple Measures

Biggest problem in measuring EF -> the task-impurity problem. All tasks necessarily include systematic variance attributable to non-EF processes associated with the task context, making it difficult to cleanly measure the variance of interest. Since most clinical EF studies have used a single task to assess the EF processes of interest, results are nearly always a mixture of non-EF, common-EF, and specific-EF component effects, making interpretation difficult.

This problem can be alleviated by using multiple measures of each component under investigation. If the chosen tasks share little systematic non-EF variance, one can see what’s common across tasks and use the resulting ‘purer’ variable as a measure of EF. E.g. multiple measures of each EF component should be used and then aggregated to measure common EF.

Simplest way to combine data from multiple measures is to calculate a z-mean across tasks. Advantage – because scores are averaged across tasks rather than taken from any individual task, variance unrelated to the construct of interest no longer drives the effects. Disadvantage – it merely combines scores, so error variance remains and can reduce power. So, if the sample is large enough, it’s preferable to use latent variable approaches (e.g. factor analysis, structural equation modeling) for extracting the variance shared across tasks while removing error variance.
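A minimal sketch of the z-mean composite (the scores below are invented for illustration, not data from any study): standardize each task so different scales contribute equally, then average per participant.

```python
import numpy as np

# Hypothetical scores for 5 participants on three tasks
# (rows = participants, columns = tasks); values are invented.
scores = np.array([
    [0.90, 0.72, 0.81],
    [0.75, 0.60, 0.70],
    [0.95, 0.80, 0.88],
    [0.60, 0.55, 0.52],
    [0.85, 0.70, 0.79],
])

# z-score within each task (column), so scale differences between tasks drop out
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)

# The z-mean composite: one 'purer' score per participant
composite = z.mean(axis=1)
print(composite.round(2))
```

As the text notes, this only averages away task-specific scale differences; measurement error stays in the composite, which is why latent variable approaches are preferred when the sample is large enough.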

Task Selection

It’s also important to carefully pick tasks. Many clinical studies now use traditional neuropsychological measures tapping multiple aspects of (non-)EF abilities. Useful for screening for severe deficits but too broad to answer questions about specific aspects of EF that may be implicated in psychopathology. Complex neuropsychological tests tap a variety of cognitive processes, making interpretation difficult. This can be addressed by using tasks designed to specifically place demands on individual aspects. Important to include specific tasks to identify what processes account for impairment on broad neuropsychological tasks.

Many studies also use questionnaires or self-report measures. These correlate poorly with task-based measures of EF and shouldn’t be assumed to measure the same constructs. Questionnaire-based measures have ecological validity because they ask about real-world situations, but they pose interpretational problems because of the multiple executive and non-executive functions involved in real-world settings and contextual influences. Specific questions about EF are best addressed using targeted tasks.

Sensitivity and reliability of tasks are important. Tasks should be sensitive to the magnitude of deficits expected from the sample being tested. Tasks with low reliability show poor correlations with other measures, and reliability is sample-specific. Unfortunately, complex EF tasks tend to have low internal and test-retest reliability because of the different strategies people use when completing them.
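The cost of low reliability can be quantified with Spearman’s classical attenuation formula: the expected observed correlation between two measures is the true correlation scaled by the square root of the product of their reliabilities (the numbers below are illustrative, not from the article).

```python
def max_observable_r(true_r: float, rel_x: float, rel_y: float) -> float:
    """Expected observed correlation given a true correlation and the
    reliabilities of the two measures (Spearman's attenuation formula)."""
    return true_r * (rel_x * rel_y) ** 0.5

# A hypothetical true EF-symptom correlation of .40, measured with a
# low-reliability complex EF task (.50) vs a more reliable one (.90),
# against a symptom scale with reliability .80:
print(max_observable_r(0.40, 0.50, 0.80))
print(max_observable_r(0.40, 0.90, 0.80))
```

With the low-reliability task the observable correlation shrinks by over a third, which is exactly the false-negative risk low reliability creates.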

Problems with sensitivity and reliability are troublesome because they can lead to false negatives, resulting in studies that are not published, or that are published with the conclusion that EF isn’t impaired in the clinical group.

There are a number of commercially available task batteries including tasks assessing EF.

  • Advantages – often they have more extensive psychometric evaluations and norms, their standardization allows for clear comparison.
  • Disadvantages – they generally don’t provide comprehensive coverage of different components aligned with current models, and don’t provide multiple measures of each construct needed for latent variable approaches. Also less likely to yield new insights because they’ve been heavily used in most clinical populations.

Other Methodological Considerations

How the data is collected and analyzed is also important. When the total individual variance in EF task performance is broken into EF, task-specific, and error components, the ‘noise’ of non-EF task-specific variance and error variance can be large, while the ‘signal’ of EF-specific variance may be small. So, to detect the signal most relevant to the inquiry, it’s important to minimize error variance and maximize power. First, there is a strong need to increase sample sizes to improve power (underpowered studies feed the file-drawer problem and lack of replicability). Second, once data are collected, the reliability and validity of the measures depend on how they are screened and analyzed. It’s also important to screen for and address outliers.

Future Directions

Given the discussed limitations to previous research and the goal of understanding links between EF and psychopathology at a level of specificity that can support translational research, we propose two broad directions for future research.

Testing Models of Unity/Diversity Across Both EF and Psychopathology

First direction suggested is that the problem of understanding the undifferentiated nature of EF impairments across disorders may be made more tractable by testing models that include both unity and diversity in psychopathology and EF.

What gives rise to broad patterns of impairment in EF across disorders? These deficits can’t easily be explained by non-specific factors. In most cases effect sizes are similar across core EF domains. This pattern of broad impairment is consistent with the theory that people with multiple forms of psychopathology have impairments in the unitary component of EF, posited to be the ability to actively maintain task goals.

Latent variable models of psychopathology find that there’s a common factor spanning all aspects of common psychopathologies (the ‘p factor’), in addition to more specific factors. Transdiagnostic impairments in EF might be explained by a link between the p factor and common EF. But cognitive factors that appear transdiagnostic at one level of analysis may not be when more detailed measures at multiple levels of analysis are considered.

Causal Models

Second direction suggested is that research needs to move beyond cross-sectional case-control designs to test different possible causal links between EF and psychopathology.

A general shortcoming of the broad field of cognitive risks in psychopathology across the lifespan is the frequent lack of consideration of possible models of how cognitive impairments and psychopathology may be causally related. It’s unknown if EF deficits (a) precede and are a potential causal risk factor for developing psychopathology, (b) follow, and are a consequence of psychopathology, or (c) are a correlate of psychopathology without playing a causal role.

Many studies assume a particular causal model, but there have been fewer attempts to try to rule out/in particular models based on evidence. Cross-sectional case-control studies aren’t able to differentiate between these possible models.

In sum, the causal links between EF and psychopathology haven’t been well established, and the mechanisms connecting them are unknown and in need of theoretical and empirical investigation. These questions have important implications for prevention and treatment.

Treatment Implications

Current evidence suggests that approaches aimed at teaching compensatory strategies may be the most promising direction for future translational research. There’s little evidence supporting direct training of EF (targeting the weakness instead of teaching compensatory strategies), and little evidence that such training generalizes to real-world functioning or improves clinical symptoms. There remains the possibility that training that better targets areas of weakness might transfer better.

Treatment and prevention programs involving compensatory strategies may therefore be a more promising direction for translational research, e.g. goal-management techniques, which have been shown to improve functional outcomes in individuals with schizophrenia.

In addition to this there may be a need to adapt and personalize current treatment approaches to match client’s EF abilities – tailoring treatment approaches through better understanding of their profile.

Executive function deficits also have important implications for psychopharmacological treatments. As with behavioural therapies, pre-treatment EF has been shown to predict drug treatment response. Better understanding of EF deficits can enhance targeting of medications that affect the neurotransmitter systems known to be involved in those EF processes.

Conclusion

Cognitive and clinical psychology have followed largely independent paths. It’s argued that it’s necessary to move past the ‘parallel play’ of these fields to push clinical psychological science toward a better understanding of how/why EF is broadly compromised across disorders.

It’s recommended to apply validated models of EF to clinical research using multiple tasks to obtain purer measures, and also select/analyze tasks that minimize noisiness of EF data.

To address the task-impurity problem and improve reliability, we recommend carefully choosing the EF components to focus on, using multiple measures of each component of interest, and combining them into composite scores or via latent variable analysis.

We advise using more specific EF measures rather than traditional, broad neuropsychological tests. The hope is that combining current theoretical and methodological advances of clinical and cognitive science can move the field toward understanding the processes underlying EF impairments at a level that enables translational research to improve treatment.

Article summary with Transdiagnostic mechanisms of psychopathology in youth: Executive functions, dependent stress, and rumination by Snyder a.o. - 2019

Executive function (EF) processes help us respond to the environment and regulate our thoughts and behaviours towards our goals. Meta-analytic evidence shows that EF deficits are pervasive across psychopathologies and EF tasks, suggesting that they may be transdiagnostic risk factors for psychopathology. Four key limitations of most prior research hinder progress in testing this hypothesis.

  1. Most studies use standard neuropsychological test approaches that confound different aspects of EF, leading to seemingly undifferentiated nature of impairments across disorders.
  2. Vast majority of research has investigated individual disorders without taking the pattern across disorders into account – ignoring comorbidity between disorders.
  3. Most research has focused on adults, missing the key adolescent/emerging adult phase for EF development and psychopathology risk.
  4. Few studies have examined how (mediation) and for whom (moderation) EF and psychopathology are related.

This study aims to address these limitations by examining linkages between latent dimensions of EF and latent psychopathology dimensions in a sample of adolescents and emerging adults.

Unity/Diversity Model of EF

EF is best characterized as separable but related cognitive processes, with unique and shared individual differences, genetic influences, and neural substrates. The unity/diversity model focuses on three aspects of EF.

  1. Shifting, defined as switching between task sets or response rules (e.g. classification by shape or colour).
  2. Inhibition, defined as suppressing or resisting automatic responses in order to make less automatic but task-relevant responses (e.g. Stroop task).
  3. Updating, defined as monitoring for and adding task relevant information in working memory, and deleting irrelevant information.

These abilities correlate, suggesting there’s a common EF ability involved in all three aspects. Common EF is posited to be the ability to monitor for and maintain goal and context information and use that information to bias ongoing processing.

Each ability can be decomposed into what’s common across all three EFs (unity) and what is unique to that particular ability (diversity). Two large, independent youth and adult samples support a model with a common EF factor and updating- and shifting-specific factors; common EF accounts for individual differences in inhibition. The different components of EF identified in this model differentially predict individual differences in clinically relevant behaviours, with recent evidence finding that common EF is the primary predictor, relating to behavioural disinhibition, attentional problems, and transdiagnostic psychopathology. Meta-analytic evidence shows generally similar effect sizes across core EF domains, consistent with the theory that individuals with multiple forms of psychopathology have impairments in the unitary component of EF.

Bifactor Models of Psychopathology and Links to EF

Psychopathology has also been shown to consist of both common and specific factors. In contrast to historical conceptualizations of disorders as categorical conditions, they’re now conceptualized as continuous symptom dimensions.

There’s a long history of modeling internalizing and externalizing dimensions of psychopathology liability, recently expanded to include a common factor (the p factor). The p factor model has been replicated in multiple samples. A critical question to ask of such models is whether they relate to theoretically and practically important risk factors and outcomes and can help advance clinical science (are they useful?). Significant evidence has validated the p factor in relation to a wide variety of psychopathology risks and outcomes and in developmental continuity.
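The bifactor structure described here can be sketched with simulated data (loadings are invented for illustration, not estimates from any study): every symptom dimension loads on the general p factor plus exactly one specific factor, so dimensions from different spectra correlate only through p.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # simulated individuals

# Orthogonal latent factors (hypothetical): general p plus two specifics
p = rng.normal(size=n)
internalizing = rng.normal(size=n)
externalizing = rng.normal(size=n)

# Each symptom dimension = p + its own specific factor + noise
depression = 0.6 * p + 0.5 * internalizing + rng.normal(scale=0.5, size=n)
anxiety    = 0.6 * p + 0.5 * internalizing + rng.normal(scale=0.5, size=n)
aggression = 0.6 * p + 0.5 * externalizing + rng.normal(scale=0.5, size=n)

# Within-spectrum pairs share p AND a specific factor; cross-spectrum
# pairs share only p, so their correlation is weaker but still positive.
print(round(float(np.corrcoef(depression, anxiety)[0, 1]), 2))
print(round(float(np.corrcoef(depression, aggression)[0, 1]), 2))
```

This is what makes the model useful for the question in the text: a risk factor tied to p alone would show up as a broad, transdiagnostic association across all symptom dimensions.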

These models thus hold promise for clarifying the many patterns between risk factors and categorically defined disorders. Recent conceptual models have proposed that executive dysfunction is a risk factor for common psychopathology. Several studies have found the p factor to be associated with poorer performance on EF tasks, including working memory and a single EF composite in children, working memory, flexibility, response inhibition, and updating tasks in adolescents, and working memory and shifting tasks in adults. P factor has also been linked to structure and function of prefrontal areas involved in EF in youth.

To note, these studies used manifest EF variables or single, unitary EF factors, and so didn’t directly test the hypothesis that common psychopathology liability is linked to poorer common EF. Only one study so far has tested links between the bifactor model of psychopathology and the unity/diversity model of EF. This study found that the common EF factor assessed at age 17 was predicted by p factor assessed across childhood and adolescence only for male participants and based on teacher, not parent, ratings. When modeled separately, internalizing related to poorer common EF across genders and raters, and externalizing related to better shifting-specific EF. So associations between the p factor and poor EF are consistently found across studies, but results have somewhat varied.

Mechanisms Linking EF and Psychopathology

There’s a lack of research investigating why EF is associated with psychopathology dimensions. Hankin et al. (2016) posited that stress could mediate risk between EF and the expression of common and internalizing psychopathology. Dependent stressors are negative life events partly influenced by a person’s behaviour (e.g. failing an exam). The stress generation model maintains that individual-difference vulnerabilities related to psychopathology impair functioning and increase risk for dependent stressful life events. Poor EF has been proposed as a factor that can contribute to stress generation: poor EF may contribute to functional impairments resulting in dependent stressors (e.g. failing an exam because of a failure to plan). It was found that the link between EF task performance and internalizing symptoms increases with age, due to an increased association between EF and dependent stressors – possibly because adult caretakers compensate for younger adolescents’ poor EF (e.g. reminders to complete homework), preventing poor EF from being translated into behaviours that lead to stressful life events.

Dependent stressful life events strongly predict rumination, a pattern of repetitive thought in response to an emotional state. Rumination is proposed to act as a transdiagnostic risk factor by amplifying current mood states and impairing problem solving. This predicts that rumination is associated with the p factor, but that hasn’t been directly tested.

Mediation by stress generation and rumination could potentially explain why EF impairments are broadly associated with psychopathology. One study found that dependent stressful life events predicted internalizing symptoms both directly and via increased rumination – both factors are transdiagnostic risks, and stress predicts the p factor.

Current Study

Recent research using latent dimensional models of psychopathology has begun to disentangle the sources of the broad EF impairments observed across disorders. Research suggests that these associations may be mainly driven by shared psychopathology liability (the p factor). Most research relating EF to p factor models hasn’t employed models that can differentiate between EF components. Lastly, the mediating mechanisms by which poor EF confers risk for psychopathology remain speculative.

The current study tests associations between the unity/diversity model of EF and the latent dimensions of psychopathology liability in a community sample of youth during key adolescent to emergent adult period of enhanced psychopathology risk and continuing EF development.

This study is cross-sectional, but allows preliminary tests of potential mediating mechanisms which future longitudinal studies can investigate. Study aims to clarify potential risk pathways between EF impairments and forms of psychopathology liability, and accelerate progress in understanding how these impairments may contribute to co-occurrence across psychopathologies. This study extends previous research demonstrating that poor EF predicted anxiety and depression symptoms via stress generation and rumination, and that these effects were stronger in older youth.

They hypothesized that the seemingly broad deficits on EF tasks associated with forms of psychopathology are best explained by poorer common EF associated with common psychopathology liability. Because of mixed and limited prior research, there were no a priori hypotheses regarding the externalizing- or internalizing-specific liability factors, so exploratory analyses were conducted. If, as predicted, common EF was associated with the p factor, they hypothesized that this association would be partially mediated by the indirect path through dependent stressful life events and rumination, and that the EF-stress path would be stronger for older youth.

Method

Participants

292 participants aged 13-22. Selected to maximize racial and economic diversity.

Procedure

Youth participated in hour lab visits, with breaks. Participants gave written informed consent (18-22) or parental consent (13-17). Youth completed three EF tasks assessing updating, shifting, and inhibition. Participants asked to complete self-report questionnaires online before their visit. All studies approved by IRB.

Measures

EF Tasks

EF tasks were taken from Friedman et al. (2016), except for the stop signal task. The Friedman et al. tasks have been found to have good internal and test-retest reliability, and convergent validity, including significant factor loadings on common EF and, for updating and shifting tasks, on the updating- and shifting-specific factors. The stop signal task was validated in reference to mathematical models and neural indices of response inhibition.

Updating: for all updating tasks, the performance measure is the proportion of correct responses across all trials. Tasks: Keep track, Letter memory, Spatial 2-back

Shifting: for all shifting tasks, participants first practiced a block of each sub-task separately, followed by mixed-task blocks. The performance measure for shifting tasks is the switch cost: the difference in mean response time between correct task-switch trials and task-repeat trials in mixed blocks. Participants were instructed to respond as quickly as possible without making mistakes, which were indicated by an error beep. Tasks: Number-Letter, Colour-shape, Category switch
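As a minimal worked example of the switch-cost measure (the response times below are invented for illustration):

```python
from statistics import mean

# Hypothetical response times (ms) from correct trials in mixed-task blocks
switch_rts = [820, 910, 870, 950]  # task-switch trials
repeat_rts = [640, 700, 660, 680]  # task-repeat trials

# Switch cost = mean RT on switch trials minus mean RT on repeat trials;
# larger values indicate poorer shifting
switch_cost = mean(switch_rts) - mean(repeat_rts)
print(switch_cost)  # 217.5 ms for these illustrative values
```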

Inhibition: Tasks: Antisaccade, Stop signal, Stroop

Questionnaires

  • Children’s Depression Inventory – CDI
  • Penn State Worry Questionnaire for Children – PSWQC
  • Manifest Anxiety Scale for Children – MASC
  • Child Behaviour Checklist Youth Self-Report – YSR
  • Multisite Multimodal Treatment Study of Children with ADHD – MTA
  • Strengths and Difficulties Questionnaire – SDQ
  • Adolescent Life Events Questionnaire Revised – ALEQR
  • Children’s Response Styles Questionnaire – CRSQ

Discussion

This study aimed to understand possible risk pathways between poorer EF and internalizing and common psychopathology (p factor) liability. For older youth, poorer common EF was associated with higher internalizing liability in a one-factor internalizing model, and with higher common psychopathology (p factor) in the bifactor model. Poorer common EF was also associated with more dependent stressful life events in older youth, which were in turn associated with higher rumination for all youth. For all youth, rumination and dependent stressful life events were associated with higher internalizing liability in the one-factor internalizing model, and with higher levels of the p factor and the internalizing-specific factor in the bifactor model. Overall, results suggest that the many-to-many relations between different forms of psychopathology and measures of different aspects of EF may be more parsimoniously explained by a link between common EF and both common psychopathology liability (p factor) and internalizing-specific liability in older youth, with stress and rumination potentially serving as mediators of these relations.

Specificity to EF Dimensions

Links with psychopathology liability were specific to the common EF factor, rather than updating- or shifting-specific EF. This is consistent with a previous study testing associations between the p factor and the unity/diversity model, and with most previous studies using more specific symptom dimensions as well as broad impairments. But one study found that depression symptoms were associated with lower common EF cross-sectionally but prospectively predicted lower updating-specific EF, suggesting that associations may differ in some cases.

Why might the common EF factor be linked to psychopathology liability? It’s thought to capture the ability to actively maintain and manage goals and use them to control ongoing processing, a demand shared by all EF tasks. In contrast, updating- and shifting-specific demands are likely to occur only intermittently in daily life. So poor common EF may be especially likely to impair functioning, including in ways that increase stress. This relation is likely transactional, with stress in turn leading to EF impairments.

Age Moderation

Common EF in this study significantly interacted with age, such that poorer common EF was associated with dependent stressful life events only in older youth. This may be due to the role adults play in compensating for younger adolescents’ poor EF, buffering against stress generation.

Though age moderation was predicted based on past findings, other studies have found associations between EF and the p factor in younger individuals. However, they also used different methods of assessing psychopathology, which could account for the differences between those studies and the current one.

Stress and Rumination Effects

The primary focus of this study was to better understand EF-psychopathology liability links, but it also provided new insights into how stressful life events and rumination relate to the bifactor model of psychopathology liability.

  1. Dependent stressful life events were associated with all three psychopathology liability dimensions. Association with p factor is consistent with the model of stress as a transdiagnostic risk factor, but associations with internalizing- and externalizing-specific factors also suggest that stress further confers specific risk for internalizing and externalizing dimensions. Dependent stressful life events are a strong risk factor for depression and anxiety, so may be more strongly associated with the internalizing-specific factor than other types of stress.
  2. Rumination hasn’t been previously examined in relation to the bifactor model. This study found rumination to be strongly associated with the p factor. It was also associated with higher internalizing-specific liability, though this association was much weaker – supporting the view that rumination is a broad transdiagnostic risk factor rather than specific to depression or internalizing psychopathology. However, this additional association suggests it may confer risk for anxiety and depression via both common and specific mechanisms.
  3. Youth who engaged in higher rumination had lower externalizing-specific liability, after variance shared with internalizing was accounted for.

Limitations and Future Directions

There were several limitations in this study:

  • Study was cross-sectional, precluding conclusions about the temporal ordering of variables in the models.
  • Tested a hypothesized process model based on previous research and theory, identifying possible mediating mechanisms as a promising start for future longitudinal studies.
  • More longitudinal research is needed with EF and psychopathology assessed at multiple time points to see if EF is a risk or consequence of common psychopathology, or both (transactional relations).
  • This study focused on psychopathology and EF in adolescence and emerging adulthood in a nonselected community sample. Future research is needed to determine if the model generalizes to other age groups, and to high-risk or clinical populations. More research is needed to test whether EF-stress generation links are specific to the late adolescence/emerging adulthood developmental period.

Translational Implications

More research is needed to translate these findings into practice, but better understanding the mechanisms by which EF may serve as a transdiagnostic risk factor for psychopathology has the potential to inform new targets for intervention. There’s a lot of interest in EF training as a potential prevention/treatment strategy, but so far there’s little evidence that training transfers to real-world functioning. Interventions aimed at disrupting the link between poor EF and stress generation could therefore be a promising approach, rather than attempting to train EF. This study found that the link between EF and dependent stressful life events was specific to common EF, suggesting that training in compensatory goal-management strategies could mitigate the effects of poor common EF, reduce stress, and potentially reduce psychopathology risk.

Conclusions

Broad patterns of poorer performance on EF tasks associated with forms of psychopathology symptoms may be best explained by associations between common EF and common psychopathology liability, though there are also internalizing-specific associations. Poorer common EF was associated with internalizing and the p factor via dependent stressful life events and rumination in older youth. This developmental period of increased demands for independence may be a critical window for risk associated with poor EF. Interventions aimed at disrupting the link between poor goal-management and stress have potential for reducing dimensional psychopathology, particularly in emerging adults.

Article summary of Specificity of executive functioning and processing speed problems in common psychopathology by Nigg a.o. - 2017

What is this article about?

Neuropsychological abilities are important for psychopathology. For example, they could serve as potential markers of genetic or other disease liability, or as components of pathophysiology. There is a growing movement at the NIMH that wants to improve the nosology (classification) of mental disorders, based on neuropsychological abilities, such as cognitive and emotional functioning.

However, this is hard, because there is no one-to-one association between neuropsychological abilities and mental disorders. Different psychiatric disorders can be characterized by similar neuropsychological deficits, such as problems in emotion regulation, executive functioning, or information processing. Nevertheless, many studies of psychopathology act as if there were a one-to-one relationship between these phenotypes and individual disorders. The current article examines the overall correlational structure of intermediate phenotypes within different mental disorders. The term ‘intermediate phenotype’ is used because it is not known to what extent executive functioning or processing speed are endophenotypes, i.e. pathophysiological indicators. It is thus also not fully known how executive functioning and processing speed can serve as liability indicators for different disorders; this article aims to make that clearer. The authors describe two important candidate intermediate phenotypes relevant to improving the nosology of mental disorders: executive functions (EF) and processing speed or response speed (Speed).

What is Executive Functioning?

Executive functioning is a very important term in general psychology. It refers to top-down (brain-driven), goal-directed cognitive processing. There have been many different, overlapping definitions: for example, some define it as a top-down process necessary for goal-directed action, while others define it as complex cognition. In novel situations, one is especially reliant on executive functioning.

In relation to psychopathology, executive functioning determines a lot of capabilities, such as problem solving, impulse control, and emotion regulation. Thus, executive functioning processes have been suggested to be intermediate phenotypes, endophenotypes, or measures for attention-deficit hyperactivity disorder (ADHD), antisocial personality disorder, substance use disorder, depression, anxiety, schizophrenia, autism, and learning disorders.

Specific executive functioning processes differ across different models. In the current paper, the authors chose a few specific key abilities, namely: set-shifting and maintenance, interference control, response inhibition, and working memory. The authors also included response consistency/variability, which may be a correlate of either executive functioning or ‘Speed’.

The fact that executive functioning includes many different processes complicates the discussion. Therefore, the authors take a holistic perspective and focus on a combined measurement of executive functioning. Executive functioning is further important because there is a growing market of cognitive training programs aimed at improving particular components of executive functioning, such as improving working memory as a way to improve psychopathology. However, before such interventions can be rolled out, it is important to understand what executive functioning is and how it relates to psychopathology.

What is Output Speed?

The term ‘speed’ is used to refer to the different processes that determine the speed of response on a task, or output speed: speed of perception, efficiency of information processing and accumulation, speed of and bias toward response preparation, and speed of response execution. The final outcome of these processes is reflected in the speed of responding, which will be called ‘Speed’. It is important to distinguish this speed from executive functioning processes, and especially from efficiency.

Speed has not always been theorized as a marker of psychopathology, but it has gained interest as an index of genetic liability and white matter integrity. In the literature, processing speed has been associated with intelligence (IQ), health, and psychopathology, and slow speed is widely used as an outcome measure in intervention studies for different health conditions. Studies of executive functioning nevertheless often neglect processing speed. This is surprising, because processing speed, a lower-order process, feeds into higher-order processes such as executive functioning.

Which psychopathologies are selected?

In the current article, the authors chose to focus on common, co-occurring disorders: ADHD, antisocial personality disorder (ASPD), alcoholism, substance use disorder, depression, and anxiety disorder. These disorders often co-occur and can be grouped into liability dimensions; thus, the authors chose a subset of psychopathology that is common and often overlapping.

What is the relationship between Speed, EF, and structures of psychopathology?

Weak executive functioning and slow Speed are implicated in many different disorders, yet EF dysfunction is neither a necessary nor a sufficient cause of mental disorders. Why, then, is EF implicated in so many disorders? Reviews that have compared EF across disorders agree that different studies use different methodologies, which makes interpretation difficult.

In the current study, the authors want to clarify the combined relationship between EF and Speed for common disorders. By doing this, they aim to understand how intermediate phenotypes can be integrated into the nosology of mental disorders. One possibility is that Speed sometimes accounts for the apparent effects of EF, because EF measures are often confounded with Speed. The authors propose three models.

  1. The Specificity model. This model suggests that different disorders are associated with different types of executive functioning deficits. Thus, EF should not be considered a single construct but as consisting of components (the component structure). This approach may show, for example, that ADHD is associated with poor working memory and that antisocial behavior is related to poor response inhibition.
  2. The Severity model. According to this model, EF and Speed impairments are associated with the overall severity of psychopathology, rather than with a specific form of psychopathology. Within disorders, severity varies in terms of impairment (overall functioning level) and in other ways. The authors used two proxies for severity. First, they counted the number of co-occurring disorders; this index helps clarify whether a cognitive deficit associated with a disorder is due to that disorder itself, or to the disorder being nested in a cluster of many disorders. The second proxy was the clinician-rated Global Assessment of Functioning (GAF).
  3. The Dimension model. According to this model, EF or Speed deficits are related to one or more shared, underlying psychopathology liability dimensions rather than to specific disorders. For example, Krueger (1999) found two broad, superordinate factors that account for the pattern of correlations among liabilities to common mental disorders: an internalizing factor, related to depression and anxiety disorders, and an externalizing factor, related to antisocial and substance use disorders. Internalizing means that people experience problems in their thoughts, emotions, and feelings without necessarily showing this in their behavior; externalizing means that someone shows disrupted, outwardly directed behaviour. Other studies have suggested a general psychopathology factor (‘g’ or ‘p’).
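The comorbidity-count severity proxy described in the Severity model above is simple enough to sketch in code. This is a toy illustration, not the authors' actual analysis; the participant data and the `severity_index` helper are hypothetical.

```python
# Toy sketch of the comorbidity-count severity proxy.
# Data and helper name are hypothetical, not from the study.

def severity_index(diagnoses):
    """Severity proxy: number of distinct co-occurring lifetime disorders."""
    return len(set(diagnoses))

participants = {
    "p1": ["ADHD"],
    "p2": ["ADHD", "depression", "alcohol use disorder"],  # nested in a cluster
    "p3": [],  # no diagnosis
}

severity = {pid: severity_index(dx) for pid, dx in participants.items()}
print(severity)  # p2 carries the highest comorbidity load
```

A higher count indicates that a disorder is embedded in a broader cluster of problems, which is what this proxy is meant to disentangle from the effect of any single disorder.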

What methods did the authors use?

The authors combined two adult samples from overlapping local communities, and they were chosen because they had extensive clinical evaluations of comorbid disorders, were community-recruited and thus avoided the bias of clinic-referred samples, and contained a broad representation of the disorders of interest.

Sample 1: ADHD Adults and Controls

This sample was initially recruited for a study evaluating the relationship between executive functioning and adult ADHD. Participants were recruited from the community through public advertisements, and eligible participants were assessed during a face-to-face interview with a clinician using the Structured Clinical Interview for DSM-IV Axis I Disorders. Disorders that are common in childhood (ADHD, CD, and ODD) were assessed with the Kiddie Schedule for Affective Disorders and Schizophrenia, and IQ was assessed using the short form of the WAIS-III. Participants were excluded if they were in a current episode of major depression, mania, or hypomania; were unable to remain sober during testing; had an IQ below 75; or were taking anti-psychotic, anti-depressant, or anti-convulsant medication. The final sample consisted of 363 adults.

Sample 2: Substance Abuse Study

This sample was initially recruited for a study of the etiology of substance use disorders, from the preschool years until the time of greatest substance use. Participants were parents from families in the Michigan Longitudinal Study, an ongoing longitudinal study of the development of alcohol and other substance use disorders. Families were recruited from the community based on the alcoholism status of the father during 1985-1993, when their target child was in preschool and the parents were aged 20-53. Data were collected at initial recruitment (Wave 1) and at 3-year intervals thereafter; in Wave 5, participants underwent neuropsychological testing. There were 159 ‘court alcoholic’ families, recruited because the father had been convicted of drunk driving with a high blood alcohol content, but was not undergoing litigation. There were also 91 control families, in which neither parent had a substance use disorder, recruited via door-to-door canvassing in the neighborhoods of the court alcoholic families. This canvassing also identified 61 ‘community alcoholic’ families, in which the father met the criteria for alcoholism but had never been arrested for drunk driving. The initial sample consisted of 607 men and women. Participants completed the SMAST, the Drinking and Drug History Questionnaire, the Antisocial Behavior Inventory, and the Diagnostic Interview Schedule Version III; DSM-IV alcohol-related and ASPD diagnoses were also established by clinicians, and IQ was assessed using the short form of the WAIS-R. The authors counted the total number of disorders as an index of severity. Participants were excluded if they met criteria for a lifetime diagnosis of psychosis or bipolar disorder, or if their IQ was below 75; this excluded 2% of the participants. The final sample consisted of 470 participants.

Neuropsychological Test Battery for Both Samples

The neuropsychological measures were the same for both samples.

Set Shifting

To measure set shifting, the authors used the Trail Making Test, a paper-and-pencil test that consists of two parts. In Part A, participants connect numbered circles as fast as possible without making errors; in Part B, they must alternate between numbers and letters in ascending alphabetical-numerical order. Part A reflects output speed, while Part B additionally reflects scanning, motor speed, and switching.

Interference Control

To measure interference control, the authors used the Stroop Color-Word Test. In this test, the participant sees color words such as “red” or “green” printed in varying ink colors; for example, the word “green” may appear in red ink. The participant must name the ink color rather than read the word, so interference control is the ability to say ‘red’ to the word “green” printed in red ink, suppressing the automatic tendency to read the word itself.

Set Maintenance and Working Memory

To measure higher-level cognitive control, the authors administered the Wisconsin Card Sorting Test (WCST). Participants view a computer screen with four ‘key cards’, to which they must match stimulus cards. This measure reflects rule detection, set maintenance, shifting, interference control, and working memory updating, summarized here as working memory.

Response Inhibition

The Go-Stop Task was administered to measure response inhibition. Participants view a computer screen that displays an X or an O in black and white, and must respond by pressing the corresponding ‘X’ or ‘O’ button as quickly as possible. On trials where a tone sounds, they must withhold their response.

Response Time (RT) variability

To measure RT variability, the authors computed the within-subject standard deviation of reaction times on the ‘go’ trials of the Go-Stop Task. This measure is thought to reflect EF or arousal state.
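The RT variability measure is straightforward to compute. A minimal sketch with hypothetical go-trial data (not the study's data):

```python
# Minimal sketch: RT variability as the within-subject standard deviation
# of reaction times on 'go' trials. The RT values below are made up.
from statistics import stdev

# go-trial reaction times in milliseconds for two hypothetical participants
go_rts = {
    "consistent_responder": [402, 398, 405, 401, 399, 404],
    "variable_responder":   [310, 520, 455, 290, 610, 380],
}

rt_variability = {pid: stdev(rts) for pid, rts in go_rts.items()}

# The more variable responder gets a larger score, which the article
# interprets as reflecting lapses in executive control or arousal state.
assert rt_variability["variable_responder"] > rt_variability["consistent_responder"]
```

Note that `stdev` is the sample standard deviation; whether the study used the sample or population formula is not stated, but the ordering of participants is the same either way.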

What can be concluded?

In this article, the authors examined the relationship of psychiatric disorders with EF and response Speed. They found that psychopathology can be conceptualized hierarchically: specific manifestations such as alcoholism or depression are nested within higher-order factors. Processing speed was associated with higher-order liabilities for psychopathology and with its severity, especially in the externalizing domain and in ADHD. In contrast, EF did not operate as a general factor but at a componential level, with specific cognitive weaknesses related to specific manifestations of psychopathology. This suggests a two-stage model of neuropsychology and psychopathology that may guide how we approach intermediate phenotypes: processing or response Speed is an intermediate phenotype for both externalizing psychopathology and overall severity of psychopathology, while EF components are intermediate phenotypes for subsets of that liability. The neurobiological basis of processing speed may reflect delayed or immature white matter development; for instance, delayed cortical maturation and diffuse white matter alterations have been reported in individuals with ADHD, and similar findings exist for antisocial personality disorder and substance abuse.

Article summary of The association between executive functioning and psychopathology: general or specific? by Bloemen a.o. - 2018

Background: the authors modeled both psychopathology and executive function (EF) as bi-factor models to examine whether EF impairments are transdiagnostic or relate to individual syndromes, and whether such associations are with general or specific EF impairments.

Introduction

Problem domains of psychopathology are highly correlated, both concurrently and over time. It has been proposed that the structure of psychopathology is best captured by a bi-factor model, with general psychopathology on one side and specific problem domains on the other. Specific problem domains include internalizing (INT) and externalizing (EXT) problems, as well as ADHD and autism spectrum disorder (ASD) problem domains. The bi-factor model separates what is common across different psychopathology domains from what is unique to each domain.

EF is crucial for our daily functioning in guiding goal-directed behaviour. It includes a broad range of cognitive processes, e.g. suppressing automatic responses, switching between tasks, and maintaining/updating information. Different theoretical models of EF share the idea that the structure of EF includes general and specific parts: EFs have something in common, but are separable from one another.

EF impairments are widely accepted as a common characteristic of various psychiatric disorders. The focus has long been on finding distinct cognitive profiles for these disorders, which has produced inconsistent findings: research hasn’t converged on which EFs are impaired in each disorder. It has been found that EF dysfunctions are generally widespread, severe, and pronounced in disorders that are severe and chronic, e.g. schizophrenia. Conversely, such impairments have not been found in younger populations with mild to moderately severe disorders like depression or anxiety. Caspi et al. (2013) showed that poorer performance on EF tasks was associated with the general psychopathology factor (p factor) but not with the specific INT and EXT factors. Furthermore, EF seems more impaired when multiple conditions are present. Together, these findings suggest that EF impairments are more strongly associated with the severity and chronicity of psychiatric problems than with distinct diagnoses, which could explain discrepancies in research that has focused on distinct disorders.

One could ask whether impairments in general or specific EF are related to psychopathology. Previous studies have examined EF performance either on lumped (aggregated) scores or on separate neuropsychological tasks. Studies using separate tasks have shown uniform impairments across tasks in relation to psychopathology, which may indicate that the association between EF and psychopathology is not process specific; studies that lumped scores on EF tasks suggest the same. Although these findings suggest an association at the general EF level, aggregation through sum or factor scores still captures the scores on individual tasks, so given this overlap no firm conclusions can be drawn. In contrast, a bi-factor model splits the variance into general and specific parts, allowing a clear-cut interpretation of the extent to which associations between psychopathology and EF are generic or more separable.
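How a bi-factor model "splits the variance" can be illustrated with a toy simulation. This is not the authors' model, which was estimated with structural equation modeling on real task data; the loadings and data below are invented purely to show the logic: two task scores driven by a shared general factor correlate, while their specific parts are (nearly) uncorrelated once the general factor is partialled out.

```python
# Toy simulation of the bi-factor logic: observed scores = general factor
# + task-specific factor + noise. All loadings and data are made up.
import random

random.seed(1)

def pearson(x, y):
    """Pearson correlation, computed by hand to stay stdlib-only."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

n = 5000
g  = [random.gauss(0, 1) for _ in range(n)]   # general factor
s1 = [random.gauss(0, 1) for _ in range(n)]   # factor specific to task 1
s2 = [random.gauss(0, 1) for _ in range(n)]   # factor specific to task 2

# each observed task score loads on the general factor AND its own specific factor
task1 = [0.6 * gi + 0.5 * s + random.gauss(0, 0.4) for gi, s in zip(g, s1)]
task2 = [0.6 * gi + 0.5 * s + random.gauss(0, 0.4) for gi, s in zip(g, s2)]

print(pearson(task1, task2))   # clearly positive: driven entirely by g

# Partial out the general factor. Here g is known by construction; a real
# bi-factor model has to estimate it from the covariance structure.
resid1 = [t - 0.6 * gi for t, gi in zip(task1, g)]
resid2 = [t - 0.6 * gi for t, gi in zip(task2, g)]
print(pearson(resid1, resid2))  # near zero: the specific parts are separable
```

This is exactly why the bi-factor approach permits a clear-cut interpretation: an association of psychopathology with the residual (specific) part cannot be an artifact of the shared general part.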

To summarize, it is generally agreed that multiple EF impairments are involved in a range of disorders, but extensive research has not agreed on distinct EF profiles for distinct disorders. This study investigates whether EF problems relate to the severity/chronicity of psychopathology rather than to its type, and whether general or specific EF impairments are associated with psychopathology. To do so, psychopathology, measured over multiple occasions during the course of adolescence, is modeled in a bi-factor model, and EF, measured at two occasions during adolescence, is modeled in a bi-factor model as well. Through this double bi-factor approach, the study aims to understand the relationship between psychopathology and EF. Hypothesis: the association is generic, i.e., EF impairments are associated only with general psychopathology.

Method

Sample

2230 adolescents from the Tracking Adolescents’ Individual Lives Survey.

Measures

  • Child Behaviour Checklist: parent rated questionnaire.
  • Child Social Behaviour Questionnaire: captures six symptom dimensions typically seen in children with ASD.
  • Executive functioning: using 1) Baseline Speed Task, 2) Feature Identification Task, 3) Sustained Attentional Dots Task, 4) Memory Search Letters Task, 5) Shifting Attentional Set Task

Discussion

The study modeled both psychopathology and EF as bi-factor models to examine whether EF impairments are transdiagnostic or relate to individual syndromes, and whether these associations are with general or specific EF impairments. With the double bi-factor approach, the best model showed that impairments in multiple specific EFs are associated with general psychopathology; this was especially true for ADHD and ASD problems. Furthermore, INT problems have a distinct association with cognitive flexibility. Stronger conclusions can be drawn from the ‘psychopathology side’ than from the ‘EF side’ of the double bi-factor model. The authors conclude, first, that inconsistent findings in the literature may be due to substantial transdiagnostic EF impairments; second, that once these transdiagnostic impairments are captured, ADHD, ASD, and INT problems still have their own specific EF profiles; and third, that whether general or specific EF relates to the p factor needs further study.

The research of Caspi et al. was extended by including ADHD and ASD problems. In line with their findings, this study showed associations with the p factor, which captures all the shared variance between different problem domains and is often considered to reflect the severity and chronicity of psychopathology.

INT and EXT problems alone do not show much EF impairment beyond the severity and chronicity captured by the p factor. Unlike this study, Caspi et al. found no association between cognitive flexibility and the INT problem domain, but this could be due to cognitive flexibility being measured with different tasks. The results suggest that cognitive inflexibility may be a core cognitive deficit of INT problems. This makes sense when considering the connection between cognitive flexibility and rumination, a key component of INT disorders: people with impaired cognitive flexibility are less able to ‘reset’ their minds after daily hassles and therefore fixate on thoughts that evoke feelings of worry.

The ASD and ADHD problem domains show associations with specific EFs above and beyond associations with the severity and chronicity of psychopathology. Both show impairments in visuospatial working memory, sustained attention, feedback responsiveness, and cognitive flexibility, while impairments in psychomotor speed and working memory maintenance were specific to ADHD. Although the adolescent sample represents absent-to-very-mild severity in ASD, the study demonstrates that EF impairments are present. Moreover, although ASD is often considered a more severe neurodevelopmental disorder than ADHD, this study found that EF impairments were somewhat more widespread and stronger in ADHD.

One of the biggest strengths of this study is the use of neuropsychological data in an epidemiological context, making for fairly unique data. Moreover, the study included ADHD and ASD, lifespan conditions that should not be ignored as specific problem domains when trying to understand the structure of psychopathology. Finally, EF was modeled with a bi-factor model: by partialling out shared variance between EF tasks, measurement error and state-specific variance were minimized. Specific EFs may in part still measure non-EF-related variance, as the study was unable to use multiple tasks for each specific cognitive process; nonetheless, the task impurity problem was tackled to some extent.

This study had several limitations:

  1. The dropout rate of the sample means it may not be fully representative of the general population.
  2. Had a limited set of EF measures that assess fairly specific EF dimensions, rather than more complex processes or higher order EFs which are more closely related to behaviour.
  3. Use of EF measures differs greatly between studies. Emphasizes need for future studies to determine if these conclusions hold in samples using different tasks.
  4. Though there isn’t literal overlap in items and constructs, the items do tend not to be fully specific for the construct they measure – limiting the specificity of the psychopathology domains.
  5. Use of parent-reported questionnaires may have different properties for INT and EXT issues.

By studying diverse psychopathology domains simultaneously, this study showed that EF problems cross diagnostic boundaries and play a domain-specific role in ADHD, ASD, and INT symptomatology. It is concluded that the association between psychopathology and EF cannot simply be considered either generic or specific, and that examining both components of EF and psychopathology allows for clearer conclusions on distinct EF profiles for distinct disorders.

Article summary with Worse baseline executive functioning is associated with dropout and poorer response to trauma-focused treatment for veterans with PTSD and comorbid traumatic brain injury by Crocker a.o. - 2018

Introduction

Several studies indicate that posttraumatic stress disorder (PTSD) is more prevalent in Operation Enduring Freedom/Operation Iraqi Freedom (OEF/OIF) Veterans than in the general U.S. population. PTSD leads to substantial personal and societal costs: increased likelihood of unemployment, reduced work productivity, poorer physical health, etc. Negative outcomes associated with PTSD also extend to increased rates of suicide attempts, homelessness, substance use, and domestic violence. Approximately 20% of OEF/OIF Veterans have a history of traumatic brain injury (TBI), mostly in the mild range of severity. Mild TBI (concussion) is defined as a blow to the head resulting in loss of consciousness of less than 30 minutes, or an alteration of consciousness/posttraumatic amnesia lasting less than 24 hours.

The expected recovery from mild TBI is a return to baseline functioning within three months, with complete resolution of TBI-related sequelae. But a significant minority report post-concussive symptoms, including cognitive difficulties, with higher rates of poor recovery in military personnel than in civilians. Research indicates that comorbid psychiatric conditions play a significant role in the persistence of symptoms and cognitive deficits in those with a history of mTBI. Individuals with both conditions have poorer functional outcomes, greater severity of mental health and post-concussive symptoms, and decreased quality of life compared with those with either condition alone.

Cognitive Behavioural Therapy (CBT) is shown to be one of the most effective PTSD treatments and can reduce post-concussive symptoms in those with a history of mild-to-moderate TBI. Despite efforts by the VA/DoD to make evidence-based treatments available to Veterans with PTSD, a substantial portion of people drop out of treatment prematurely, don’t respond to it, or relapse after completion. Treatment engagement is worse for OEF/OIF Veterans: they are less likely to begin treatment, attend fewer sessions, and have higher dropout rates than civilians and Veterans from other eras. A history of mild TBI may exacerbate these effects.

It is hypothesized that a likely barrier to treatment completion and effectiveness is the executive function problems present in people with PTSD and a history of TBI. EFs are a set of higher-level cognitive abilities that organize and integrate lower-level cognitive processes to guide behaviour and perform complex, goal-directed tasks. There are three interrelated but distinct core EFs: shifting/cognitive flexibility, updating working memory, and inhibition, which together contribute to other EFs (planning, problem solving, reasoning). EF is important for success in everyday life, and EF deficits are associated with maladaptive behaviours and negative outcomes including substance use, crime, violence, recklessness, worse physical health, and poorer treatment adherence.

Neuropsychological research indicates that PTSD and a history of TBI have, separately and jointly, been associated with EF deficits, including impairments in inhibitory control, working memory, task shifting, and sustained attention. Theories propose a bidirectional relationship between PTSD and EF: EF deficits contribute to the development and maintenance of PTSD symptoms, and PTSD symptoms in turn exacerbate EF impairments. There is also evidence that a comorbid history of TBI increases the risk for and contributes to EF deficits.

EF impairments make it difficult to overcome logistic barriers, contributing to non-compliance and dropout from PTSD treatment: executive dysfunction can make it difficult to plan and problem-solve around these barriers, making it hard to attend appointments and complete treatment. Further, as CBT relies on adequate EF, deficits could hinder the effectiveness of PTSD treatments. This is particularly a concern for CBT approaches that are primarily cognitive in nature, like CPT, which involves identifying and challenging maladaptive trauma-related thoughts to alter their impact on behaviour. Intact EFs are essential in CPT to engage in the cognitive skills involved in treatment.

Worse EF at baseline has been associated with poorer response to CBT in psychosis and schizophrenia, generalized anxiety disorder, and obsessive-compulsive disorder, but research on baseline measures of cognitive functioning as predictors of treatment response is limited. The two relevant published studies to date regarding EF show mixed findings: one found that neuropsychological measures of EF obtained before treatment did not predict response to CBT, whereas another, using fMRI, showed that dysfunction in prefrontal and cingulate regions during an EF task prior to treatment predicted nonresponse to CBT for PTSD. No studies until now have examined whether baseline EF deficits predict dropout and poorer treatment outcomes in people with comorbid PTSD and a history of TBI.

This study examined whether neuropsychological measures of EF obtained prior to treatment would be associated with dropout and response to CPT in Veterans with PTSD and a history of TBI. A secondary analysis of a randomized controlled trial of standard CPT vs. modified CPT including compensatory rehabilitation strategies was performed. Hypothesis: worse baseline EF would be associated with reduced CPT completion and responsivity.

Method

Participants

The sample consisted of 74 participants who 1) were OEF/OIF Veterans with PTSD, 2) had a history of mild to moderate TBI, 3) reported current subjective cognitive complaints, and 4) had been stable on psychiatric medication for at least six weeks.

Procedure

Participants were assessed for inclusion/exclusion criteria and provided informed consent for participation. They were randomized to one of two 12-week treatment conditions: 1) standard CPT or 2) SMART-CPT, a novel hybrid treatment that integrated psychoeducation about TBI and compensatory cognitive rehabilitation strategies from CogSMART into CPT. SMART-CPT included cognitive strategies focused on attention, memory, and executive functioning, and modified CPT to use more concrete language, repetition of key points via written summaries and brief reviews, and simplified/restructured worksheets.

Baseline assessment of study variables included measures of demographics, TBI characteristics, psychiatric and post-concussive symptoms, quality of life, and cognitive functioning. Participants completed a symptom measure of PTSD weekly during treatment to assess ongoing symptom change in addition to completing it during the three assessment visits.

Measures

  • Symptom questionnaires
  • Neuropsychological measures
  • TBI characteristics
  • Statistical analyses

Discussion

As hypothesized, poorer performance on baseline measures of EF was associated with dropout and reduced responsivity to trauma-focused treatment in Veterans with PTSD and a history of mild-to-moderate TBI. Those who dropped out of treatment prematurely performed more poorly at baseline on EF tests of novel problem solving and shifting/cognitive flexibility, though only novel problem solving remained a significant predictor of dropout when controlling for baseline symptom severity. Worse baseline performance on EF tests was associated with poorer response to CPT. Finally, baseline measures of memory, intellectual functioning, and education were not associated with dropout or treatment response, indicating that the effects were specific to EF.

The findings are consistent with theory suggesting that executive dysfunction in people with PTSD and a history of TBI may be a barrier to treatment completion and responsivity, especially for CPT, as intact EFs are needed for the successful use of CBT strategies. Worse EF (reduced cognitive flexibility) may be especially problematic for therapy approaches that predominantly involve cognitive restructuring, such as CPT, because reappraisal techniques rely heavily on executive processes to inhibit maladaptive thoughts and beliefs and to flexibly generate and evaluate alternative, more realistic thoughts. Problems with EF can also lead to treatment dropout because of difficulties using planning and problem-solving skills to overcome the various logistic barriers that make it difficult to attend therapy appointments.

Adaptive motivational processing relies on intact EF so that goals can be selected based on their predicted outcomes, behaviours can be planned to achieve these goals, and goal-directed action can be maintained in the face of distraction. EF impairments may therefore contribute to difficulty maintaining motivation to persist in treatment, especially when it becomes emotionally challenging. Motivational dysfunction can also manifest in therapy-interfering behaviours like missing sessions, avoiding assignments, and engaging in maladaptive coping strategies when in distress. EF deficits could thus contribute to difficulties in flexibly planning and implementing strategies to overcome the challenges of trauma-focused treatment and persist in long-term goal achievement.

Results also fit with previous empirical studies demonstrating that worse EF at baseline associates with dropout from CBT for substance dependence and generalized anxiety disorder, as well as reduced response to CBT in several psychiatric disorders.

Little research has considered whether poorer neuropsychological functioning in any cognitive domain at baseline reduces response to PTSD treatment, and the few studies examining these possible relationships focused mainly on memory. The nonsignificant findings for memory (discrepant with previous studies), paired with significant results for EF, suggest that the relationship between aspects of cognition and treatment may depend on the type of treatment approach used.

These findings have several clinical implications. Assessing pre-treatment cognitive functioning using low-cost neuropsychological measures may inform efforts to better match individuals with treatments they are most likely to benefit from. Also, altering current PTSD treatments to use methods that more directly target executive dysfunction could improve treatment adherence and boost effectiveness; adding cognitive training to existing treatments has also increased mental health treatment completion rates. Taken together, the results suggest that directly targeting EF via cognitive training before treatments like CPT could strengthen executive networks and allow Veterans to engage more fully in, and benefit more from, the components of CPT. Finally, the results indicate that those with reduced cognitive flexibility benefit from modifications to CPT, including adding compensatory strategies and altering CPT to include more concrete language, repetition of key points, written summaries, and simplified worksheets.

Strengths:

  • Use of multilevel modeling with 14 assessment time points – facilitating better modeling and estimation of treatment response.
  • Contribution of baseline neuropsychological functioning to treatment completion and response were considered jointly, which has rarely been done previously, despite high dropout and nonresponse rate to PTSD treatment.
  • Multiple EF measures were included to examine different aspects of EF, given the importance of considering distinct EFs separately.

Limitations:

  • The findings may be difficult to generalize to other populations: the sample was predominantly male Veterans, and it is unclear whether the results would be consistent for non-Veteran samples or female Veterans.
  • It is also unclear whether the same relationships would be observed in individuals with only PTSD (no history of TBI). It is possible that the observed relationships are somewhat unique to those with both conditions, as these individuals often have more severe symptoms, worse executive abilities, and poorer functioning, which negatively impacts treatment. However, TBI characteristics were not associated with dropout from CPT, and other studies have shown that the presence of a TBI history did not predict treatment response (though these samples were small).

Future research would benefit from larger samples to further explore the role of comorbid TBI history and associated injury characteristics in treatment compared to PTSD alone. But despite these limitations, this study makes important contributions to our understanding of relationships between baseline EF abilities and PTSD treatment completion/responsivity.

Article summary with Safety behaviours preserve threat beliefs: Protection from extinction of human fear conditioning by an avoidance response by Lovibond a.o. - 2009

There’s considerable evidence that access to within-situation safety behaviours can interfere with the beneficial effects of exposure therapy for anxiety. A similar outcome has been observed in panic disorder. The best explanation is the cognitive account, which holds that patients attribute the absence of the feared outcome to their safety behaviour and so fail to update their threat beliefs. Exposure in the absence of safety behaviour is thought to disconfirm excessive threat beliefs and lead to long-term fear reduction.

A closely related phenomenon that’s been demonstrated in the conditioning laboratory is protection from extinction. In this procedure, a Pavlovian conditioned stimulus (CS) is established as a predictor of an unconditioned stimulus (US), and is then subjected to extinction by presenting the CS without the US. “Protection” refers to the finding that extinction is impeded by the presentation of an inhibitory CS during the extinction phase. The best explanation for this effect is that the inhibitory stimulus cancels the expectancy of the US generated by the target excitatory stimulus, so that there’s no discrepancy between what’s expected and what happens – and without a discrepancy, no extinction.
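This expectancy-discrepancy account is essentially an error-driven learning model. A minimal Rescorla-Wagner sketch (purely illustrative: the cue names, starting strengths, and learning rate are assumptions, not values from the article) shows why an inhibitor blocks extinction:

```python
# A minimal Rescorla-Wagner sketch of protection from extinction
# (illustrative only; cue names, starting strengths, and the learning
# rate alpha are assumptions, not values from the article).

def rw_update(strengths, outcome, alpha=0.3):
    """One trial: every present cue is updated by the shared prediction error."""
    error = outcome - sum(strengths.values())   # outcome minus summed prediction
    return {cue: v + alpha * error for cue, v in strengths.items()}

# After acquisition: C is excitatory (predicts shock, +1);
# the safety cue / avoidance response I is inhibitory (-1).
control = {"C": 1.0}                 # extinction of C alone
protected = {"C": 1.0, "I": -1.0}    # extinction of C with the inhibitor present

for _ in range(6):                   # six extinction trials, shock never delivered
    control = rw_update(control, outcome=0.0)
    protected = rw_update(protected, outcome=0.0)

print(round(control["C"], 3))        # 0.118 - C has largely extinguished
print(round(protected["C"], 3))      # 1.0 - summed prediction was already 0, so no error
```

Because the excitor and the inhibitor sum to zero, the no-shock outcome generates no prediction error on protected trials, so the excitatory strength of C survives extinction intact – mirroring the preserved threat beliefs of the Protection group.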

The phenomenon of protection from extinction provides a potential lab model for investigating the role of safety information during exposure therapy. A study found that a conditioned inhibitor interfered with extinction of a target excitatory stimulus. But the primary sources of inhibition or safety in therapy are thought to be safety behaviours produced by patients themselves, as opposed to an external source.

This study uses a procedure recently developed to study instrumental avoidance learning. It builds on the Pavlovian fear conditioning procedure where different coloured shapes serve as CSs for shock or no shock. Sometimes participants have access to response buttons. Access is signaled by illumination of the buttons, but participants have to press the right button to avoid shock. Skin conductance and shock expectancy are recorded.

Method

Participants

65 undergraduate students who volunteered.

Apparatus

Participants were tested individually in a darkened room. The CSs were coloured squares and the US was an electric shock produced by a constant current generator.

Skin conductance was measured through electrodes attached to the second and third fingers of the participant’s non-preferred hand.

Procedure

Participants were told the experiment consisted of a number of trials with rest periods in between and that on each trial, a coloured square appeared for 5 seconds followed by a 10 second waiting period, followed by either a 0.5 second shock or no shock. They were told there was a relationship between the colour of the squares and the occurrence of the shock that they should try and work out. Participants were told that response buttons may light up, and that pressing a lit button may cancel a pending shock.  Participants were asked to use the expectancy pointer during the waiting period after the square disappeared to indicate their expectancy of whether a shock would occur or not.

Following this, the Pavlovian acquisition phase was designed to establish the CS-US contingencies prior to introducing the opportunity for an avoidance response. Two CSs, A and C, were paired with electric shock (A+, C+), and a third CS, B, was presented without shock (B-). No response buttons were illuminated during this phase. In the Avoidance acquisition phase, participants were given the opportunity to make a button-press response during presentations of stimulus A. To simplify learning, only the correct response button was illuminated. If they pressed the button, the shock was cancelled.

In the Extinction phase, stimulus C was presented six times without shock. Protection group was given the chance to make the button-press response for this stimulus. The button was not illuminated for the control group. In the final (Test) phase, the impact of the extinction trials was examined by presenting C alone, without response opportunities.

Discussion

The experiment provided clear evidence for protection from extinction of a Pavlovian fear CS by an instrumental avoidance response. The Control group showed normal extinction of stimulus C, whereas the Protection group, who made the avoidance response during the Extinction phase, showed little extinction. This adds to the evidence for protection from extinction in humans and extends it from external stimuli to an internally generated response.

Participants weren’t given a choice of responses, so the illumination of the correct response button was strongly correlated with the absence of the shock US. Was the light itself the critical factor in the protection effect? No: the light was only correlated with the absence of the US as a result of performance of the instrumental button-press response.

In general, the same pattern was observed in this experiment on both expectancy ratings and skin conductance. There was one point of divergence between the measures: when the avoidance response was made available in the Avoidance acquisition phase, it led to a modest reduction in shock expectancy ratings to stimulus A, presumably due to the prior instructions suggesting that pressing a response button might cancel the shock. The opposite effect was observed on the skin conductance measure – making the avoidance response available led to an increase in skin conductance on the first trial on which it was available. This could be due to arousal associated with performing the avoidance response.

We interpret the protection from extinction effect as analogous to the role of within-situation safety behaviours in preserving threat beliefs and anxiety in patients undergoing exposure therapy (shock expectancy ratings correspond to threat beliefs, skin conductance to anxiety/fear). The results are consistent with the cognitive account proposed by Salkovskis et al. (1999). By this account, the Protection group attributed the absence of the expected electric shock on the extinction trials of C to the avoidance response. So when the response wasn’t available in the Test phase, they retained their high shock expectancy to C and showed strong anxiety. The Control group didn’t have an alternative explanation for the absence of the expected electric shock on those trials, and were forced to revise their shock expectancy downwards – they lowered their ratings and skin conductance responses in the Test phase.

It may seem strange to account for conditioned responding in terms of a cognitive mechanism, as it has been seen as a low-level, reflexive process. But recent evidence suggests that it may depend on high-level cognitive processes. This perspective is compatible with cognitive threat appraisal theories of anxiety. Expectancy of harmful outcomes may provide a common explanatory mechanism across fears derived from learning, observation, and instruction. The results imply that habituation of an innate fear response and extinction of an acquired fear response may both be based on a reduction in expectancy of threat.

Lastly, clinically, the present results support the strategy of minimizing safety behaviours during exposure therapy and giving patients a clear cognitive rationale for doing so. The avoidance procedure provides a potential lab model to investigate the cognitive and other mechanisms underlying the protection of beliefs against disconfirming evidence. It could be used to study potential positive benefits of safety manipulations, as well as interactions with other variables of therapeutic relevance such as pharmacological agents.

Article summary with Exposure treatment in multiple contexts attenuates return of fear via renewal in high spider fearful individuals by Bandarian-Balooch a.o. - 2015

Background and objectives: research shows that after exposure treatment, re-exposure to a previously feared stimulus outside the treatment context can result in renewal of fear. This study investigates whether conducting exposure treatment in multiple real-life contexts can attenuate renewal of fear.

Introduction

Following successful exposure-based treatment of specific phobias, there’s a high risk of relapse of anxiety symptoms. Conditioning research has provided evidence that the renewal effect is an underlying mechanism responsible for the return of fear. Research concludes that renewal of fear may happen when a feared stimulus is encountered outside of the treatment context, so methods are needed to enhance the generalizability of exposure treatment across contexts.

Lab-based research with humans has yielded mixed results on whether extinction treatment in multiple contexts attenuates renewal, possibly due to methodological differences. Clinical analogue studies, however, have consistently found that conducting exposure treatment in multiple contexts attenuates renewal of fear when follow up is conducted in novel contexts. Support was also found for the idea that exposure treatment using multiple stimuli (different spiders) enhances the generalizability of exposure treatment.

Vansteenwegen et al. (2007) exposed spider-anxious students to videos of a spider in different filmed contexts. Follow up testing in a novel context found significant renewal of fear, indicated by self-report and skin conductance, for the group shown the spider filmed in one context. Renewal of fear was attenuated for the group exposed to the video of a spider in multiple contexts.

Recently, Shiban et al. (2013) attenuated renewal of fear in 40 spider phobic individuals using a virtual spider and multiple virtual contexts that differed by background colour. Self-reported fear and skin-conductance responses showed significant renewal of fear to a virtual spider for those who received exposure treatment in only one virtual context. For those receiving treatment in multiple virtual contexts, renewal was attenuated.

This study aims to extend the findings of these previous reports. Neither study used real-life contextual changes which could’ve limited the applicability of those studies to real-life situations where contexts may: a) vary by multiple sensory cues, b) present unique challenges, c) vary on the informative value of the present cues. Both experience with a task and the informative value of contextual cues have been found to moderate attention to contextual cues and consequently affect the context dependence of learning.

Shiban et al. conducted a behavioural avoidance test (BAT) using a real-life contextual change and spider to examine the generalizability of their virtual reality treatment. During the test, however, group differences were limited to behavioural avoidance, and participants weren’t told to touch the spider, which potentially resulted in ceiling effects on fear renewal. Still, the single extinction context group was found to be more avoidant of the real spider than the multiple extinction context group.

This study examines whether conducting exposure treatment in multiple real-life contexts with a real spider enhances the generalizability of exposure treatment to novel contexts and attenuates fear renewal. The study allowed participants to complete any step they were willing to at each stage, enhancing the likelihood of observing avoidance.

Participants were randomly allocated to either:

  • Control group (BBB): B) treatment in one context, B&B) follow up in the same context.
  • Single extinction context group (BEF): B) treatment in one context, E&F) follow up in novel context.
  • Multiple exposure context group (BCDEF): B, C, and D) treatment in three different contexts, E&F) each follow up in a different context.

Follow up testing was conducted one week and four weeks after treatment for all groups. It was hypothesized that there would be renewal of fear for the BEF group. Also hypothesized that renewal of fear would be attenuated for the BCDEF group.

Method

Participants

46 moderately to extremely fearful participants, scoring between 17 and 26 on the Spider Phobia Questionnaire (SPQ), took part for treatment benefits and/or in exchange for course credit. Recruitment was via website advertisement or mass testing sessions using the SPQ during university classes. Participants were randomly assigned to the three conditions, and group membership was independent of gender.

Therapist

To ensure consistency in treatment adherence and pace of treatment across participants, the same exposure hierarchy was used for all participants, a treatment manual was devised and used at each session, and the researchers frequently discussed adherence to the treatment manual.

Apparatus

A non-harmful spider (Nephila plumipes) was used; the same spider was used throughout the experiment. The experimental contexts were locations on the university campus and were counterbalanced across groups and phases of the experiment.

Procedure

The procedure consisted of three sessions, lasting a maximum of 4 hours. The initial session comprised 35 minutes of pre- and post-exposure assessment and a 1.45-hour exposure treatment. Follow up testing sessions one week (FU1) and four weeks (FU2) after exposure were 35 minutes each. Participants were randomly assigned to the treatment groups.

The spider was placed 3 meters opposite the participant, who had to perform a self-chosen step on the hierarchy and report their fear level at that distance. They rated their maximum fear at the current step and their anticipated fear for the next step of the hierarchy. If a participant’s fear during a step dropped to 10 or below on the 100-point SUDS (subjective units of distress) scale, they had to perform the step again without the therapist being present.

Treatment was completed when all steps of the hierarchy were completed. They were given a 5-minute resting period without the spider to provide a post-treatment baseline. Next they were asked to complete the last step of the hierarchy again without the therapist present, providing a post-treatment SUDS.

Results

There was a significant increase in SUDS, HR, and avoidance measures for all groups between post-treatment and the follow up sessions. This increase is probably due to the expected spontaneous recovery effect caused by the delay from post-treatment to follow up. The BEF group showed an overall pattern of greater fear than the BBB and BCDEF groups. This indicates that renewal of fear occurred and that it was attenuated by conducting exposure treatment in multiple contexts.

Discussion

The study showed successful treatment of moderate to high spider fear using one-session exposure treatment. Results showed that renewal of fear occurred when exposure was conducted in one context and subsequent fear tests were conducted in novel contexts, confirming the first hypothesis.

A strength of this study is that renewal was found with a triad of measures: verbal, physiological, and behavioural.

Other methods to attenuate fear renewal have been identified. Future research could combine extended exposure or context similarity with exposure in multiple contexts to examine if this maximizes treatment benefits and reduces likelihood of renewal.

The effects of extinction treatment in multiple contexts on renewal of fear can be explained using the memory model of learning. During fear acquisition, CS-US associations (spider-pain) are learned and stored in memory. During extinction treatment, CS-noUS associations (spider-no pain) are learned and stored in memory alongside the CS-US association, making the relationship between CS and US ambiguous. When the feared object (CS) is encountered following treatment, the ambiguity between the CS and US is resolved by memory retrieval cues from the environment. If the cues in the follow up context overlap with the treatment context more than with the acquisition context, the CS-noUS association will be retrieved and renewal of fear attenuated. If not, the CS-US association is retrieved and renewal of fear occurs.

So the more cues from the exposure treatment context that are present at follow up, the more likely it is that the CS-noUS association will be retrieved and renewal attenuated. For the BCDEF group, conducting exposure treatment in multiple contexts theoretically created more overlapping cues between the exposure treatment and follow up contexts than between the initial fear acquisition context and the novel follow up contexts, so their renewal of fear was attenuated. For the BEF group (single exposure context), it seems the overlap in contextual cues between treatment and follow up wasn’t enough to prevent retrieval of the CS-US association, leading to renewal of fear.
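This cue-overlap logic can be captured in a toy sketch (hypothetical cue sets and a bare overlap count, invented for illustration – nothing here was measured in the study):

```python
# Toy sketch of the memory-retrieval account of renewal.
# All cue sets below are invented for illustration.

def retrieved_association(test_cues, acquisition_cues, extinction_cues):
    """Retrieve whichever association was trained in the phase whose
    context shares more cues with the test context."""
    if len(test_cues & extinction_cues) > len(test_cues & acquisition_cues):
        return "CS-noUS"   # spider-no pain retrieved: renewal attenuated
    return "CS-US"         # spider-pain retrieved: fear renews

acquisition = {"garden", "web", "leaves"}           # where fear was learned
single_ext = {"room_b", "sofa", "poster"}           # BEF: one treatment context
multiple_ext = single_ext | {"room_c", "desk", "plants",
                             "hall", "window"}      # BCDEF: cues pooled across three contexts

novel_followup = {"room_e", "desk", "window", "garden"}   # novel test context

print(retrieved_association(novel_followup, acquisition, single_ext))    # CS-US: renewal
print(retrieved_association(novel_followup, acquisition, multiple_ext))  # CS-noUS: attenuated
```

On this account, the point of the BCDEF manipulation is simply that pooling cues from three treatment contexts makes it more likely that any novel context will overlap the extinction memory more than the acquisition memory.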

Some limitations: the study used moderate to high spider fearful participants rather than people with diagnosed spider phobia. Secondly, it used the same therapist throughout the experiment, meaning the therapist wasn’t blind to condition, which could bias the results. This was mitigated by having the therapist absent during post-treatment and follow up testing.

In conclusion, the study observed verbal, physiological, and behavioural evidence that renewal can be attenuated by conducting exposure treatment in multiple contexts. This provides clinicians with a better understanding of the role of context in relapse and how the likelihood of relapse can be reduced by making small but important adjustments to already empirically validated behavioural treatment protocols.

Article summary with Interpersonal processes in social phobia by Alden & Taylor - 2004

Social Phobia/Social Anxiety Disorder is an interpersonal disorder, a condition where anxiety disrupts a person’s relationships with others. This paper aims to determine what’s known about the interpersonal aspects of social anxiety and to understand how it affects the development of relationships.

Interpersonal theory incorporates developmental experiences, social cognition, motivation, and social behaviour in a cohesive model and can therefore provide a structural framework for this review – using different research areas.

Social anxiety is conceptualized and measured differently across domains. The various concepts are related but not interchangeable. E.g. developmental researchers study temperaments of behavioural inhibition and social timidity, whereas personality/social psychologists study shyness and social anxiety (as situation-induced states) and personality disorder.

1. Interpersonal Perspective

Interpersonal models of psychopathology share the assumption that good social relationships are tied to a person’s psychological well-being and that poor social relationships contribute to psychopathology. A central feature of this perspective is the self-perpetuating interpersonal cycle. We expect people to treat us the same way they have previously, and we tend to repeat the behavioural strategies learned to handle earlier events. Furthermore, people expecting others to respond positively to them engage in behaviours eliciting favourable responses, whereas those anticipating negative responses adopt self-protective strategies that are likely to elicit negative responses.

Interpersonal models also maintain that dysfunctional interpersonal patterns result from ongoing interaction between the individual and the social environment. Our relationships shape our habitual social behaviour as well as our sense of self and others.

2. Social Anxiety Disorder and Social Relationships

People with social phobia generally have fewer social relationships than others (fewer friends, dating and sexual relationships, and they are less likely to marry). But we don’t know much about how they function in the relationships they do develop. For example, a study of marital relationships in social phobic patients found that partnered patients reported greater life satisfaction than those without partners, but also more marital distress.

A study using structured interviews to assess university students’ relationships with friends, acquaintances etc. found social anxiety to be associated with various dysfunctional strategies in those relationships – expected strategies of nonassertiveness and avoidance of emotional expression/conflict. Socially anxious individuals reported over-reliance on others – result of dependence on the few relationships they have. Over-reliance (and nonassertiveness) was found to mediate the relationship between social anxiety and chronic interpersonal stress. Together these findings indicate that even when socially anxious people develop relationships, they view them as less intimate, functional, and satisfying than those without social anxiety.

3. Self-Perpetuating Interpersonal Cycles

Writers from different perspectives have observed that socially anxious people behave in ways that lead to negative social outcomes. Suggests that people with social phobia may establish negative interpersonal cycles between themselves and others in which they adopt behavioural strategies evoking negative reactions.

3.1. Behavioural Patterns

Behaviours commonly associated with social anxiety are low social skill, nonassertiveness, and visible anxiousness. Differences between socially anxious and nonanxious people on specific anxiety-related micro-behaviours (poor eye contact, trembling, low self-disclosure, etc.) have also been found. Research suggests socially anxious individuals appear less skillful and more anxious than others, providing clues as to which behaviours contribute to this impression.

Variability in behavioural patterns has been found in socially anxious populations – findings point to individual differences in how socially anxious people experience their interpersonal issues.

Other researchers report evidence of critical or angry behaviour in socially anxious samples. One study found that shy people reported feeling critical and non-affectionate towards their friends/others. Another found that social phobic patients reported more state/trait anger, and a tendency to express anger when criticized or treated unfairly, or even without provocation. Studies in the context of domestic violence found that 35% of wife batterers scored above the clinical cutoff on avoidant personality disorder. Avoidant personality traits predicted not only assault, but also spousal murder.

3.2. Others’ reactions

The interpersonal perspective posits that people with psychological problems often elicit negative responses from others. Shy individuals are generally rated negatively on interpersonal dimensions (warmth, likeability, etc.) by objective interviewers and close friends. Research indicates shy people are seen as less intelligent by peers, though there’s no actual association between social anxiety and intelligence. Conversely, long-term acquaintances rate shy people more positively than recent ones, suggesting others may become more positive about them with longer exposure.

Several studies found others less likely to want future interactions with socially anxious students after an initial discussion. It’s concluded that the social behaviour of anxious students led their partners to disengage from relationship development. Anxiety-related behaviour was one main factor precipitating disengagement, as well as failing to reciprocate others’ self-disclosures.

Authors concluded that socially anxious students were rated negatively by their friends, and they irritated/alienated strangers quickly.

3.3. Summary

Research suggests socially phobic people show distinctive and less functional social behaviour than others. There is increasing empirical support for interpersonal variability in the behaviour of socially anxious people. They seem to evoke less positive reactions from others, even in brief encounters, and studies suggest that the absence of prosocial behaviour is as important to others’ reactions as visible signs of anxiety.

These findings could have clinical implications. Socially anxious people worry about displaying anxiety-related symptoms, when the use of prosocial behaviours could be more important in developing relationships. Important for future research is determining whether the behavioural patterns and social responses found in social phobic individuals are specific to social phobia or shared with other psychological disorders – they could arise from comorbid conditions rather than social phobia itself, e.g. depression.

4. Social Skill Deficit or Self-Protective Strategy

If people with social anxiety behave in ways that disrupt relationship development, the next question is why. The traditional explanation is that they have social skill deficits, having failed to learn effective social behaviour – anxiety is partially a reaction to those deficits and the negative responses they produce. However, whether they display avoidant/maladaptive social behaviour seems to depend on the social context. There is speculation that dysfunctional behaviour results from cognitive and emotional processes activated by social cues, leading to self-protective behaviour.

4.1. Summary

There’s some flexibility in the strategies used by socially anxious people to deal with social events. Self-protective behaviour elicited by social cues is inconsistent with the concept of a social skill deficit, which implies a chronic behavioural deficiency. More research is needed to determine if there are limits on behavioural flexibility in social phobic people.

5. Cognitive Processing of Social Information

Cognitive theorists propose that social cues activate negative beliefs and assumptions about the self and others, leading to selective processing of threat information and biased interpretation of events – this heightens anxiety and leads to the use of self-protective behavioural strategies.

Three topics of interpersonal relevance in cognitive model research:

  1. Critical aspect of negative social schema is relational information
  2. Whether people with social phobia show selective processing of negative interpersonal cues
  3. Whether people with social phobia show negative biases in their interpretation of other people

5.1. Social Schema and the Self

Researchers in social cognition posit that social anxiety arises from the activation of a relational schema – knowledge structures based on experience with significant others. It’s argued that relational information forms a key part of our sense of self. Some suggest that social anxiety arises when someone becomes aware of a discrepancy between knowledge about the actual self and the ought self (the self one believes others think one ought to be). Baldwin (1992) suspected that socially anxious people develop negative schemas about the self in relation to others, which are readily activated by social cues. These schemas result in negative expectations for social events, which produce anxiety.

Research has shown that priming procedures can increase awareness of discrepancies between the actual and ought self. The authors concluded that activating relational information is critical to the onset of social anxiety.

Baldwin went further to examine how activation of information about others affects someone’s subjective experience of the self. He concluded that information about others is intertwined in memory with information about self, and activating one type of information affects the other.

5.2. Selective Attention to Social Cues

Cognitive models suggest socially anxious people to selectively process threat-related information. Threat information can be internal or external.

In support of selective processing, a study found that socially phobic people showed selective attention to negative social cues in a public speaking task. Other studies found them to display memory omissions for partner-related information in social interactions.

5.3. Interpretation of Social Cues

According to cognitive writers, selective processing of threat cues leads to biased interpretation of social events. Interpretation biases have been studied in three contexts: 1) judgments of self, 2) judgments of others’ reactions to self, 3) interpretations of other people’s behaviour/characteristics apart from their reactions to oneself.

It is less clear whether social phobic people display biases in their interpretations of other people’s behaviour and characteristics; studies of social interpretation yield inconsistent results.

Two lab studies of patients with social phobia weren’t able to find negative biases in patients’ interpretations of others’ characteristics in getting acquainted discussions. A recent study suggested that negative interpretation bias may be confined to people with particular social developmental histories.

5.4. Summary

The literature supports the idea that activation of relational information may play a key role in triggering social anxiety. One possibility is that relational information activation exerts its effect by altering the person’s subjective sense of self.

Right now, findings provide greater support for the existence of negative interpretation biases in judgments of self and others-in-relation-to-self than for biases in interpretations of other people’s general characteristics.

Interpersonal writers propose that habitual interpersonal patterns are the result of a social development process that begins in childhood interactions with significant others and continues through peer relationships in adolescence.

6. Social Pathogenesis

Research gives persuasive evidence that there are heritable, biological processes that increase vulnerability to social anxiety. Particularly, the presence of behavioural inhibition (BI) early in life is shown to predict social timidity in childhood and adolescence. Kagan concluded that BI is “influenced in a major way by environmental conditions existing during the early years of life.”

Developmental researchers have identified a number of early social learning experiences that are associated with behavioural inhibition, shyness, and social anxiety. For example, parental encouragement of open communication and social involvement is associated with less shyness at 12- and 24- months, and reductions in social inhibition were observed in temperamentally active infants whose mothers weren’t too responsive to fretting and crying. Conversely, patients with late-onset shyness are more likely to report parental abuse than those with early-onset shyness, who were most likely to have shy parents. Social anxiety can be produced by adverse social experiences even in children who aren’t initially inhibited.

Two themes: 1) behaviour between children and parents is interactive with each party influencing the other, 2) there’s variability in the interpersonal environments associated with social phobia.

7. Self-Perpetuating Cycles

The association between BI and dysfunctional child-rearing styles begs the question of whether parental behaviours are causal factors in the development of social fears or responses to the child’s temperament.

Several researchers address bi-directional models of parent-child relationships. Other research has demonstrated that the physiological correlates of behavioural inhibition were moderated by the security of the attachment bond between mother and child.

Research over the last decade increasingly supports a bi-directional relationship where inhibited/anxious children and their parents display a cyclical interaction pattern, perpetuating social anxiety.

8. Variability in Social Learning Experiences

Research points to at least three dimensions that characterize early social experiences in these individuals: 1) parental over-protection and control, 2) parental hostility and abuse, and 3) lack of family socializing. Overprotective, intrusive parental behaviour has received the greatest research attention.

Developmental studies revealed that mothers of anxious-withdrawn children responded to their children’s shy behaviour with attempts to direct and control how the child behaved. Anxious children also displayed the highest degree of noncompliance, a pattern that may perpetuate dysfunctional transactions between mothers and their children.

But some patients with social phobia also report histories of physical and sexual abuse. An analysis found that sexual assault by a relative and exposure to verbal aggressiveness between parents had unique effects on social phobia onset in women. Childhood physical and sexual abuse seems to pose a risk for later development of social phobia.

Social phobic patients also report more emotional abuse and neglect in their childhoods. Observational studies confirm self-report findings: mothers of extremely anxious-withdrawn children tended to use non-responsiveness, statements of devaluation, or criticism and punishment in response to their child's behaviour in a lab task.

Some patients report having limited exposure to social interactions during their development with parents encouraging this. Restricted social exposure may exacerbate social fears by constraining the development of social skills or limiting opportunities to learn that social situations can be harmless.

Interestingly, the three social development dimensions mentioned before were found to be largely independent. Principle of equifinality – there can be multiple pathways through which psychological disorders develop.

Most studies don’t address whether negative peer interactions are the cause or outcome of social anxiety, or both. It’s been concluded that shyness interferes with friendship formation, but doesn’t evoke rejection, whereas rejection can exacerbate the cognitive aspects of shyness. Supports an interactive model of social anxiety and peer relationships.

8.1. Summary

Social developmental factors and social anxiety have been addressed in numerous ways. Each method of assessment is limited in its own way, but taken as a whole the research supports the idea that the pathogenesis of social anxiety resides in an interaction of innate temperament with a family environment that either fails to help children overcome their innate timidity or exacerbates their fears through overprotection, control, abuse, isolation, or modeling.

9. Interpersonal Processes in Treatment

The first question is whether the interpersonal heterogeneity found in the developmental histories and social behaviour of social phobic people affects treatment response. There's little research on this, but studies indicate that some interpersonal patterns are associated with poor treatment outcome. For example, studies on avoidant personality reported that people with 'warm' problems (fear of offending or disagreeing with others), which reflect a desire to maintain contact with others, were more likely to benefit from CBT teaching relationship-development skills, whereas those with 'cold' problems (emotional detachment/hostility) were less likely to benefit from treatment.

Second question is whether social anxiety and dysfunctional interpersonal behaviour characterizing social phobia impairs patients’ ability to collaborate with therapists and benefit from treatment. Different conclusions reported. One explanation for inconsistent findings could be that individual and group therapy place different demands on patients. It’s also possible that interpersonal relationships with therapists are crucial to motivating those individuals.

One study showed that difficulties establishing therapeutic alliance may be confined to some individuals. E.g. self-reported childhood parental abuse was associated with weak therapeutic alliance and more negative patient-therapist interactions. It’s unclear if negative treatment response was a direct function of weak working alliance or due to some pre-existing characteristic of patients that affect their relationships with therapists as well as producing difficult interpersonal problems.

Three models of patient-therapist relationship:

1) Dynamic interpersonal therapy: patient-therapist interactions are viewed as an opportunity for patients to experience how biases in their expectations and interpretations of others’ behaviour lead them to engage in maladaptive behaviours in therapy sessions.

2) The second view is that the therapeutic relationship is an essential, or at least facilitative, factor in treatment. Treatment techniques are more likely to be effective if delivered by a supportive, empathic therapist.

3) The third view is that the therapeutic relationship is a marker of treatment progress. Therapist or patient dissatisfaction with the working alliance signals that treatment isn't effectively addressing a key element of the patient's problem.

10. General Discussion

Social anxiety is associated with fewer and more negative social relationships at all stages of life. To close, this review draws attention to the fact that in addition to suffering from anxiety-related symptoms, people with social phobia have a history of interpersonal experiences that shape their beliefs about themselves and others, their interpersonal strategies, and their response to treatment. These beliefs and strategies trap them in an interpersonal cycle that prevents them from accomplishing their interpersonal goals. The ultimate goal of treatment should be enabling social phobic people to establish closer and more satisfying interpersonal relationships.

Article summary with Interpersonal subtypes in social phobia: Diagnostic and treatment implications by Cain a.o. - 2010

Social phobia has been reported to be the most common anxiety disorder in the United States, with a lifetime prevalence rate of 13.3%. It’s characterized by a “marked and persistent fear of one or more social or performance situations in which the person is exposed to unfamiliar people or possible scrutiny by others”. It often follows a chronic course, resulting in substantial impairments in vocational and social functioning and causing these individuals to engage in avoidance behaviours allowing them to stay away from feared social situations.

A common critique of the social phobia diagnosis is the inclusion of a generalized subtype. The DSM-IV states that the generalized specifier should be used 'when the individual's fears are related to most social situations'.

Though research has consistently shown that generalized social phobia represents the more severe manifestation of the disorder, there are a number of problems with the current DSM-IV criteria for subtyping social phobia. In particular, the criteria don't explicitly define the number or type of social situations that comprise the generalized subtype. As a result, research groups have developed different operational definitions for generalized social phobia, making comparison across empirical studies difficult. The DSM-IV symptom-based classification has also been criticized for failing to create qualitatively different subgroups.

Interpersonal assessment may provide a more clinically useful way to identify qualitatively different subgroups of socially phobic individuals by identifying patients based on their distinct ways of responding to social situations. From an interpersonal perspective, it could be argued that the DSM-IV criteria don't fully capture the range of maladaptive responses to social situations that could be exhibited by socially phobic people.

Interpersonal Classification of Social Phobia

Applying interpersonal theory to diagnosis, it’s been argued that interpersonal functioning is an essential component of the diagnostic process in addition to the assessment of symptoms. It’s been pointed out that often the most useful aspects of diagnoses are psychosocial in nature and that most diagnoses are made on the basis of observed interpersonal behaviour.

One method for deriving an interpersonal classification is to use the Inventory of Interpersonal Problems – Circumplex Scales (IIP-C). It’s based on interpersonal theory, providing a nomological framework for articulating both adaptive and maladaptive dynamic interpersonal processes.

The original IIP was revised using a Circumplex model that can be conceptually organized in a circular manner along the dimensions of dominance and affiliation. It contains 64 items divided into eight subscales. These dimensions provided the basis for Leary’s interpersonal Circumplex (see figure 1) and are considered to be the basic elements of interpersonal behaviour. Circumplex quadrants are useful summary descriptors of interpersonal behaviour. Computing scores on each axis can give coordinates to define the location of the predominant interpersonal problem pattern. It also contains a general factor equivalent to mean level of reported interpersonal distress.
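The axis computation described above is standard circumplex arithmetic: each IIP-C octant sits at a fixed angle on the circle, and the predominant problem pattern is the vector sum of the octant scores. A minimal Python sketch (the octant names and angle assignments follow the conventional circumplex layout; the exact scoring constants of the published instrument may differ):

```python
import math

# Octant angles (degrees) on the interpersonal circumplex: LM (friendly side
# of the affiliation axis) at 0 degrees, PA (dominant pole) at 90 degrees.
OCTANTS = {
    "PA_domineering": 90, "BC_vindictive": 135, "DE_cold": 180,
    "FG_socially_avoidant": 225, "HI_nonassertive": 270,
    "JK_exploitable": 315, "LM_overly_nurturant": 0, "NO_intrusive": 45,
}

def circumplex_coordinates(octant_scores):
    """Project eight octant scores onto the affiliation and dominance axes.

    Returns (affiliation, dominance). Scores are usually ipsatized first
    (each octant minus the person's mean) so the general distress factor
    drops out; that step is left to the caller here.
    """
    aff = sum(s * math.cos(math.radians(OCTANTS[k]))
              for k, s in octant_scores.items())
    dom = sum(s * math.sin(math.radians(OCTANTS[k]))
              for k, s in octant_scores.items())
    return aff, dom

def predominant_angle(octant_scores):
    """Angle (degrees, 0-360) locating the predominant problem pattern."""
    aff, dom = circumplex_coordinates(octant_scores)
    return math.degrees(math.atan2(dom, aff)) % 360
```

A profile loading on the nonassertive, exploitable, and overly nurturant octants, for example, projects into the friendly-submissive region of the circle (positive affiliation, negative dominance).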

Pathoplasticity

Using the IIP-C to form interpersonally based subtypes of socially phobic individuals is based on a theory of pathoplasticity. Pathoplasticity is characterized by a mutually influencing, nonetiological relationship between psychopathology and another psychological system. Psychopathology and another psychological system influence the expression of each other, but neither one is the exclusive direct causal agent of the other, as may be the case in an etiological or spectrum relationship. Pathoplasticity recognizes that the expression of certain maladaptive behaviours, symptoms, and mental disorders all occurs in the larger context of an individual's personality.

The interpersonal paradigm asserts that maladaptive self-concepts and disturbed interpersonal relations are key elements of the phenotypic presentation of all psychopathology. It's been suggested that using an interpersonal paradigm to systematically account for these elements provides additional and valuable information beyond diagnosis itself for both treatment planning and developing testable hypotheses regarding the etiology and maintenance of psychopathology. Differences in interpersonal diagnosis will affect the manner in which patients express their distress. It'll also influence the type of interpersonal situation they feel is needed to regulate their self, affect, and relationships.

Kachin et al. (2001) (among others) have described procedures to determine the presence of a pathoplastic relationship using the IIP-C. If patients with a particular disorder aren’t defined by a uniform interpersonal profile, and they’re not defined by a complete lack of systematic interpersonal expression, then it’s necessary to examine if a pathoplastic relationship exists. Individuals with a certain disorder are subjected to cluster analyses based on their responses to the IIP-C to confirm the existence of distinct groups with characteristic interpersonal problem profiles. If data supports the clusters -> necessary but not sufficient evidence for pathoplastic relationship.
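The clustering step in this procedure can be illustrated with a toy k-means pass over ipsatized IIP-C profiles (ipsatizing removes the general distress factor so clusters reflect profile shape). This is only a sketch of the general idea, not the analysis the cited studies actually ran; a real study would use validated clustering software, multiple initializations, and fit statistics:

```python
def ipsatize(profile):
    """Subtract the person's mean octant score so the general distress
    factor drops out and only the profile's shape remains."""
    mean = sum(profile) / len(profile)
    return [score - mean for score in profile]

def kmeans(profiles, k, iters=25):
    """Tiny k-means sketch with deterministic initialization from the
    first k profiles (real analyses would use multiple random starts)."""
    centers = [list(p) for p in profiles[:k]]
    assign = [0] * len(profiles)
    for _ in range(iters):
        # Assignment step: each profile joins its nearest centre
        # (squared Euclidean distance).
        assign = [
            min(range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            for p in profiles
        ]
        # Update step: each centre becomes the mean of its members.
        for c in range(k):
            members = [p for p, a in zip(profiles, assign) if a == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centers
```

With k = 2, internally consistent groups of eight-scale profiles fall into separate clusters; whether those clusters then show the distinct, prototypical circumplex profiles required as evidence of pathoplasticity is a separate check.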

A number of investigations found that individual differences in interpersonal problems exhibit pathoplastic relationships with mental disorders, pathological symptoms, and maladaptive traits. Using the IIP-C, a study found two distinct subtypes of socially phobic undergraduates with distinct interpersonal features suggesting qualitatively different responses to feared interpersonal situations. The first subtype reported difficulties with anger, hostility, and mistrustfulness (cold-dominant group). The second subtype reported difficulties with unassertiveness, exploitability, and overnurturance (friendly-submissive group). No significant differences were found between the two subtypes on level of interpersonal distress, and they weren't significantly different on depression or other disorders comorbid to social phobia, providing evidence for pathoplasticity.

This Study

Three main goals. First, to replicate the results of Kachin et al. (2001) by using an interpersonally based approach to subtype socially phobic people using the IIP-C in a clinical sample at an outpatient psychotherapy clinic. Second, to provide evidence for the pathoplasticity of social phobia. Third, examine subtype differences on posttreatment measures of general symptom severity, level of social anxiety, psychological well-being, level of optimism, and satisfaction with social functioning. Extensive research using the IIP-C has shown that friendly-submissive interpersonal problems are positively related to psychotherapy outcome, whereas hostile-dominant problems are negatively related to outcome.

Method

Patients and Therapists

Data was collected for this naturalistic study at the University of Bern, Switzerland, in their outpatient psychotherapy clinic. Clinic accepts patients suffering from wide range of problems and disorders, except psychotic disorders and substance use disorders. This study used data from 20 different therapists.

Analyzed the data of 77 patients diagnosed with DSM-IV social phobia. Assessors at the clinic didn’t specify generalized social phobia, meaning they were unable to analyze data on the diagnostic-based subgroups of social phobia.

In this sample, 100% of the patients met criteria for at least one Axis I disorder, and 55.8% met criteria for more than one Axis I disorder. These diagnoses included social phobia, major depressive disorder, specific phobia, and many others. There was no systematic assessment of Axis II pathology in this sample.

Treatment

Treatment model at the University of Bern outpatient clinic draws on empirical findings from basic psychology, neuropsychology, and various theoretical models as the basis for an integrative framework for empirically supported psychotherapy. Grawe (1997) articulated five change mechanisms necessary for psychotherapy: a) the therapeutic bond, b) problem activation, c) resource activation, d) mastery, and e) motivational clarification.

Interpersonal Measures

Interpersonal problems were assessed using the German version of the IIP-C. The IIP-C assesses interpersonal problems across eight scales organized around the dimensions of dominance and love: domineering, vindictive, cold, socially avoidant, nonassertive, exploitable, overly nurturant, and intrusive.

Symptom and Outcome Measures

  • Bern Subjective Well-Being Inventory
  • Brief Symptom Inventory
  • Taylor Manifest Anxiety Scale
  • Insecurity Questionnaire

Retrospective Outcome Measures

  • Changes in Life Domains Questionnaire
  • Revised Questionnaire of Changes in Experiencing and Behaviour

Results & Discussion

The study addressed three major aims.

  1. First aim was to replicate results of Kachin et al. (2001) in a clinical sample. Two distinct subgroups of socially phobic patients were found: a friendly-submissive cluster and a cold-submissive cluster. The two subgroups exhibited highly prototypical circumplex profiles and nonoverlapping circular confidence intervals, suggesting that patients within each of the clusters were reporting distinct interpersonal problems. In contrast to Kachin et al., this study found a cold-submissive rather than a cold-dominant cluster, suggesting that social phobia patients in this sample reported more submissive interpersonal problems overall rather than problems associated with dominance.
  2. A second aim was to provide evidence for the interpersonal pathoplasticity in social phobia. The formation of two interpersonally distinct subtypes provides necessary but not sufficient evidence of interpersonal pathoplasticity. Additional analyses show no significant differences between the two subtypes on gender, comorbidity, interpersonal distress, and pretreatment symptom measures. Results provide sufficient evidence for interpersonal subtypes in social phobia in this sample.
  3. Third aim was to compare the two interpersonally based subtypes at posttreatment on several outcome measures. Posttreatment comparisons indicated that friendly-submissive social phobia patients exhibited lower levels of social anxiety and higher levels of well-being/satisfaction at posttreatment than cold-submissive social phobia patients. Friendly-submissive patients showed lower levels of fear of failure/critique and fear of contact with others, and were better able to assert themselves through making demands and being able to say 'no' than cold-submissive patients. Friendly-submissive patients also reported higher levels of optimism than cold-submissive patients.

On a measure of general psychopathology, though, there were no significant differences between the two subtypes at posttreatment, suggesting that what might matter most in the treatment of social phobia is targeting social fears and maladaptive interpersonal behaviours instead of overall level of psychopathology.

Interpersonal Complementarity and Interpersonal Motives

The posttreatment differences demonstrated by the two subtypes of socially phobic patients may be attributed to interpersonal complementarity and differences in interpersonal motivation. Interpersonal complementarity is defined as: “a person’s interpersonal actions tend to initiate, invite, or evoke from an interactant complementary responses,” – Kiesler (1983). Kiesler suggested that in a self-fulfilling manner, certain types of rigid, maladaptive interpersonal behaviours actually increase the probability that an individual will elicit the type of response from others that reinforces their fears and maladaptive behaviours.

Empirical studies of complementarity have found that people often don’t exhibit the expected behavioural complementarity. Horowitz et al. (2006) noted that reactions to behaviour are not only guided by the interactional quality of the person’s behaviour but also by the suspected motives of the person. Important to note that interpersonal behaviour can be ambiguous, and one behaviour can have different underlying motives, making it difficult to understand what’s behind the hostile behaviour.

To address the motivational dimension of interpersonal behaviour, Horowitz (2004) expanded the principle of complementarity by describing that individuals have interpersonal motives influencing their behaviour during interpersonal situations and that these motives are also organized around the dimensions of agency and communion. Agentic motive is related to a need for autonomy, communal motive related to a need for intimacy. He argued that people develop strategies to satisfy their motives, but the chronic frustration of interpersonal motives leads to the development of interpersonal problems and distress.

Grosse Holtforth, Pincus, Grawe, and Mauler (2007) aimed to clarify the relationship between interpersonal problems and underlying interpersonal motivations. They found that high scores on friendly-submissive interpersonal problems were associated with highly valuing interpersonal recognition and dreading separations from others, accusations from others, and being hostile. They also found that cold-submissive interpersonal problems were associated with dreading to make oneself vulnerable. Friendly-submissive patients may be seeking more interpersonal recognition from others by employing rigid and maladaptive interpersonal strategies that focus on being excessively compliant and overly friendly. They seem to fear displeasing others and being ignored or disliked and therefore strive to be excessively pleasing. Cold-submissive patients fear being hurt in social situations and may therefore try to minimize social contact and avoid intimacy and relationships with others as a way of protecting themselves from rejection.

Clinical Implications

Results of this study suggest that using the IIP-C to assess interpersonal functioning could provide additional information to the current DSM-IV approach and that traditional psychotherapy may need to be modified to better address specific interpersonal problems and interpersonal motives. They also suggest that an interpersonal classification for social phobia may help improve diagnostic clarity and inform treatment conceptualization and planning.

Incorporating an interpersonal problem component in the diagnostic assessment process could lead to better assessment of interpersonal distress and maladaptive behaviours. Several research studies have shown that friendly-submissive interpersonal problems are positively related to psychotherapy outcome, whereas cold-dominant interpersonal problems are negatively related to outcome in both cognitive-behavioural and psychodynamic therapy.

Based on the interpersonal tradition, specific interventions could be tailored to target the therapeutic relationship, the patient's interpersonal problem areas, and the patient's interpersonal motivations more effectively. Therapists might need to avoid responding to patients in complementary ways to avoid reinforcing their maladaptive relational patterns and to stimulate new social learning opportunities within the therapeutic transaction.

Modifications to traditional cognitive behavioural treatment for social phobia may be needed to target specific interpersonal problem areas. Research by Newman et al. has shown the importance of modifications to traditional CBT for anxiety disorders: they examined the efficacy of an integrative psychotherapy for generalized anxiety disorder and found that their integrative treatment resulted in clinically significant change in GAD symptomatology. These findings highlight the importance of designing treatment modifications addressing interpersonal problems to improve treatment outcome for all patients.

It's likely that both friendly-submissive and cold-submissive patients would benefit from exposure to feared social situations. But more attention should be given to how each subgroup of socially phobic patients would respond to other CBT interventions like social skills and intimacy skills training. Friendly-submissive patients may be more responsive to relational skills training than cold-submissive patients, who may need to initially focus more on interventions that increase treatment compliance and decrease fears of social rejection.

Limitations and Future Directions

  • One limitation of this study was the use of a naturalistic design, limiting the available data and the conclusions that can be drawn about psychotherapy outcome between two clusters of socially phobic patients.
  • A second limitation was the small sample size and the reduction of patients from pretreatment to posttreatment, limiting the statistical power and external validity. However, the fact that they were able to examine clinical data from an outpatient psychotherapy clinic allowed for maximization of generalizability of the results to other clinics treating socially phobic patients.
  • A third possible limitation is the use of an exclusively German and Swiss population, which may restrict generalizability to other cultures given the use of interpersonal constructs. But it seems that the clinical presentation of social phobia is often consistent between German-speaking cultures, the United States, and other European countries.
  • Finally, although this study was limited to self-report data, future research on interpersonal pathoplasticity should employ methods that code for interpersonal processes during psychotherapy to assess how the two subtypes may respond to their therapist and the therapy over the course of treatment.
  • In conclusion, results suggest that the two distinct subgroups of socially phobic patients differentially responded to the same treatment – future research should investigate techniques that will effectively target maladaptive interpersonal behaviour of both subtypes. Accounting for distinct interpersonal motives underlying maladaptive behaviour patterns could improve diagnosis, treatment planning, and therapeutic outcome.
Article summary with Possible role of more positive social behaviour in the clinical effect of antidepressant drugs by Young a.o. - 2014

Introduction

Two important characteristics of antidepressants: 1) they aren’t as effective as an ideal antidepressant would be, and 2) there’s a delay in their maximal effect. A proposed biological mechanism for the delay in onset is that different classes of antidepressants cause slow changes in pre- or postsynaptic mechanisms that increase serotonin function – responsible for mood improvement. Another proposed mechanism is based on a cognitive neuropsychological model suggesting that antidepressants “change the relative balance of positive to negative emotional processing,” resulting in later changes in mood. This paper suggests another mechanism involving serotonin-induced changes in social behaviour that will improve mood.

Serotonin and Social Behaviour in Animals and Humans

Aggression is one of the more dramatic aspects of social behaviour. A meta-analysis concluded that serotonin "has an overall inhibitory effect on aggression" in various animals. Findings suggest that serotonin may alter social behaviour along the continuum from agonistic to affiliative. Research suggests this may also be true in humans.

Acute tryptophan depletion increases aggressive responses and decreases affiliative behaviour according to lab tests. Conversely, tryptophan supplements may decrease aggression and increase positive social behaviour. Tryptophan given to schizophrenic patients decreased the number of incidents on the ward requiring intervention. Another study gave aggressive patients tryptophan, leading to a decreased need for injections of antipsychotics and sedatives to control agitated/violent behaviour. The first study found tryptophan to decrease quarrelsome behaviour but not to affect agreeable behaviours (possibly a ceiling effect). This was tested in the latter study, where participants were psychiatrically healthy but in the upper levels of the population distribution for irritability. In these individuals, tryptophan decreased quarrelsome behaviours and increased agreeable ones. This change occurred without an effect of tryptophan on their appraisal of the agreeableness of their interaction partners, suggesting a direct effect on behaviour rather than an indirect effect mediated by changes in participants' cognitive appraisal of others. This is consistent with the fact that altered serotonin function can influence social behaviour in organisms with primitive nervous systems.

Effect of Antidepressants on Social Behaviour in Healthy Humans

Seretti and colleagues reviewed 30+ studies where the effects of antidepressants were compared with placebo in healthy participants. They concluded that generally there were no effects on mood. The effects that did occur were more consistent when the antidepressants were given (sub)chronically rather than acutely, and effects included alterations in social behaviour.

Knutson and colleagues found the SSRI paroxetine to decrease subjective irritability and increase affiliative behaviour on a dyadic lab puzzle task in healthy volunteers. Tse and Bond conducted five studies in which the effects of antidepressants were compared to placebo in healthy participants, with the following results:

  1. Citalopram increased self-directedness but not cooperativeness.
  2. Citalopram had no effect on ratings by roommates, but increased cooperative behaviour in a laboratory game.
  3. Reboxetine, but not citalopram, caused participants to show more cooperative communication with a confederate behaving nonsociably and to give more cooperative communications in a mixed-motive game.
  4. No effect of reboxetine on behaviour along agreeable-quarrelsome dimension, but roommates considered participants more cooperative and agreeable when receiving reboxetine.
  5. Reboxetine had no significant effect on irritability, cooperation, or any other measure.

Variability in results is due to various factors (differences in study design, outcome measures, sample sizes). But several studies found changes consistent with improvements in behaviour along the agreeable-quarrelsome dimension – providing modest support for the idea that antidepressants may decrease agonistic and increase affiliative social behaviours in humans. Reboxetine is suggested to increase serotonin function – serotonin may be a mediator of the effects of antidepressants.

Effect of Antidepressants on Social Behaviour in Patients

Irritability occurs in roughly half of depressed patients, and usually resolves with successful treatment. Reviews suggest that about 1/3 of depressed patients experience anger attacks. One study compared effects of sertraline, imipramine, and placebo on anger attacks in patients with atypical depression and dysthymia. Anger attacks ceased in 50% of patients in active treatment groups compared to 37% in placebo group.

Studies have compared effects of antidepressants and placebo on agonistic behaviour in patients with diagnoses other than depression. One found that fluoxetine decreased anger in patients with borderline personality disorder (BPD), and another found it to decrease irritability and aggression in patients with various personality disorders. However, another study found no effect of the SSRI fluvoxamine on aggression in women with BPD. A further study of aggressive schizophrenic inpatients found that the treatment decreased the frequency of aggressive incidents.

Overall results from various studies support the idea that patients with elevated irritability may respond quicker to treatment with SSRIs than patients with only depressed mood. Results provide evidence that SSRIs can decrease aggression, anger, and irritability.

Social Interactions During Depression and Depressed Mood

Hames and colleagues reviewed interpersonal processes thought to be involved in initiating and maintaining depression. Depressed patients tend to have social skills deficits, seek reassurance excessively while also seeking negative feedback and exhibit both interpersonal inhibition and dependency. Many studies looked at how depressed mood influences social behaviour in interaction partners – no direct evidence that the response of others toward those with depressed moods was mediated directly by irritability/anger associated with depression. But it’s a plausible explanation given that quarrelsome/aggressive behaviours tend to be reciprocated by others.

Complementarity in Social Interactions and its Implications for Mood Regulation

People respond to the behaviours of others in a way governed partly by the specific behaviour of the other. It’s proposed that a person’s interpersonal actions evoke a complementary response leading to a repetition of the person’s original actions and that a certain level of intensity tends to evoke a response of similar intensity. Many studies support the idea that quarrelsomeness tends to evoke quarrelsomeness and agreeableness evokes agreeableness, though the exact response can be modulated by the context. Taken together, research suggests that in most people, more agreeable behaviours toward others will tend to be reciprocated and result in a more positive mood. Vice versa for quarrelsome behaviours – resulting in a negative mood.

Complementarity of behaviours, together with changes in mood/appraisal of others, could contribute to an iterative cycle in everyday life.

Possible Role of Changes in Social Behaviour Along the Agreeable-Quarrelsome Dimension in the Effects of Antidepressants on Mood

Research suggests:

  1. Most antidepressants enhance serotonin function
  2. Serotonin influences behaviour along the agreeable-quarrelsome dimension
  3. Depressed patients tend to be irritable and sometimes have anger attacks
  4. People tend to respond to quarrelsome behaviour with quarrelsome behaviour – same for agreeable behaviour.
  5. More quarrelsome interactions tend to be associated with negative mood, and agreeable behaviour with positive mood.

Hypothesis based on the effects of antidepressants and serotonin on mood: changes in social behaviour are one way in which antidepressants can improve mood. The change in mood after each interaction will be small, but after many interactions the effect should be much greater. This is consistent with the slow onset of action of antidepressants.
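As a purely illustrative sketch (not from the article), the compounding idea can be made concrete: if each interaction nudges mood by a tiny, fixed amount, a slight shift in the balance of agreeable vs. quarrelsome interactions is negligible on any single day but accumulates over many interactions. All names and numbers below are hypothetical.

```python
def simulate_mood(n_interactions, p_agreeable, nudge=0.01, mood=0.0):
    """Toy model: accumulate small mood nudges over many interactions.

    p_agreeable is the fraction of interactions that are agreeable
    (reciprocated positively, +nudge); the rest are quarrelsome (-nudge).
    A deterministic mix is used so the illustration is reproducible.
    """
    for i in range(n_interactions):
        if i < p_agreeable * n_interactions:
            mood += nudge   # agreeable interaction: small lift
        else:
            mood -= nudge   # quarrelsome interaction: small dip
    return mood

# A 55/45 split toward agreeable interactions is barely visible after
# one day (10 interactions) but clearly positive after many:
one_day = simulate_mood(10, 0.55)
many_days = simulate_mood(1000, 0.55)
```

The point of the sketch is only that a per-interaction effect too small to notice can still produce a clinically meaningful cumulative shift, matching the slow-onset observation.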

Increases in positive affect associated with more positive social interactions, and decreases in negative affect associated with fewer negative interactions, may play a role in the improvement of mood in depressed patients. But increases in positive affect may be more important than decreases in negative affect: research shows positive and negative affect to be separate dimensions rather than opposites on one continuum. Enhancement of positive social behaviour may be more primary in the action of antidepressants than the inhibition of negative social behaviour.

Role of More Prosocial Behaviour and Other Mechanisms in Mediating the Response to Antidepressants

The slow onset of antidepressants is possibly due to an initial inhibition of the firing of serotonergic neurons, with subsequent adaptive changes resulting in important increases in serotonin function. Research on tryptophan suggests that small increases in serotonin release are enough to promote more positive social interactions. Improvement in mood mediated by changes in social behaviour may thus be important in the initial effects of antidepressants, and may be augmented by direct effects on mood associated with the larger increases in serotonin functioning that occur later.

The cognitive neuropsychological model of antidepressant action suggests that, from the initiation of treatment, antidepressants create implicit positive biases in attention, appraisal, and memory, and that the delay in effects on mood is due to the time it takes for these emotional processing biases to influence mood.

The cognitive neuropsychological and social interaction models both suggest that antidepressants alter responses to stimuli. In the cognitive model, the change is toward a more positive appraisal of neutral and emotional stimuli. In the social model, the stimuli are the people whom a depressed patient encounters daily, and the change is a shift away from quarrelsome and toward agreeable behaviour. The important difference lies in how the altered response to a stimulus improves mood: in the cognitive model the change occurs in the mind (positive appraisals of stimuli), while in the social model the change is in behaviour.

The models are different but not mutually exclusive. Antidepressants may shift behaviour toward more agreeable interactions while simultaneously reinforcing this change through more positive cognitive appraisal of situations. This initiates a cycle of more positive social behaviour resulting in a clinically significant improvement in mood.

Conclusion

Evidence is stronger for increased serotonin function and antidepressants decreasing aggressive behaviour than for increasing agreeable behaviour. There is inconsistency in results on the effects of antidepressants on behaviour, which could be attributed to the measures used. Also, much of the evidence for agreeable social behaviour is based on studies of healthy rather than depressed people. Lastly, if more positive social interactions are a clinically significant factor in the action of antidepressants, then patients who have more social interactions early in treatment may be expected to respond better to it.

Article summary with The cognitive neuropsychological model of antidepressant response by Walsh & Harmer - 2015

The direct action of antidepressants remediates negative biases in affective processing (AP) at a neuropsychological level. These actions occur early in treatment, before mood improves. The majority of research has been conducted using antidepressants mainly affecting serotonin or noradrenaline activity in major depressive disorder (MDD).

Introduction

The acute neuropharmacological actions of most antidepressants are characterized by enhanced serotonin and/or noradrenaline neurotransmission, which is immediately detectable after administration. However, repeated administration of the drug over multiple weeks is required before a clinically important therapeutic effect is observed. The delayed onset is thought to be due to numerous neuroadaptive changes that happen between the first administration and the onset of the desired effect. Recently, focus has shifted to neurobiological models aiming to explain the delay by introducing the notion of antidepressant-induced activation of second messengers and subsequent changes in gene expression. But these models are limited, as they fail to explain how these actions relate to an improvement in mood and overall symptomatic relief. Furthermore, antidepressants don't elevate mood in healthy individuals.

These findings lead to the hypothesis behind the cognitive neuropsychological model (CNM): antidepressants don't act as direct mood enhancers. The CNM aims to explain the gap between the first administration of an antidepressant and the onset of its mood-improving effect.

The CNM in antidepressant response

In the context of MDD, individuals tend to preferentially extract negative affective information, leading to negative affective biases (NAB) that are both a basis for and fuel of MDD. The CNM suggests that, at a neuropsychological level, the direct action of antidepressants leads to increased extraction of positive affective information from a variety of social and affectively loaded stimuli, and that these actions occur early in treatment, prior to an improved mood. The model thus allows for a time interval ranging from first drug intake to a clinically important elevation in mood later in treatment.

The role of NABs in depression

MDD patients and at-risk individuals display negative biases in emotional and reward processing across cognitive domains. Such negative biases link to abnormal activity in the limbic and striatal circuitry involved in the initial appraisal and memory of affective stimuli. MDD patients show abnormal responses toward reward, punishment and performance feedback. These deviant responses are associated with reduced function in frontostriatal systems in depressed, remitted depressed, and at-risk individuals.

NAB is linked to relapse and is exhibited by people at risk of MDD by virtue of high neuroticism or a family/personal history of MDD. This suggests that NAB plays a role in the vulnerability to/aetiology of MDD rather than being a secondary consequence of low mood.

The effects of antidepressants on AP

The repeated, subchronic, and acute effects of antidepressants on AP have been assessed in healthy volunteers and MDD patients.

Subchronic antidepressant (SSRI and SNRI) treatment increases the processing of positive affective information, such as positive self-descriptive words and the propensity to perceive ambiguous facial expressions as happy, in healthy volunteers. A single administration of citalopram or reboxetine is sufficient to increase recognition of happy faces. Administration of citalopram or reboxetine also results in associated neural alterations, including attenuated amygdala responses to aversive stimuli, increased fusiform responses to happy vs. neutral facial expressions, and increased and decreased frontoparietal activity during classification and recognition, respectively, of positive vs. negative self-descriptive words.

Important note is that all these effects of antidepressants on AP occur in the absence of a significant elevation in subjective mood.

A single administration of reboxetine restores the normal balance of positive to negative processing in facial expression recognition and recall of self-descriptive words in MDD patients, in the absence of any changes in subjective mood. An 8-week treatment with an SSRI in MDD patients normalizes hyperactive amygdala, ventral striatal, and frontoparietal cortical responses to negative affective information. An 8-week treatment with an SNRI also normalizes hyperactive anterior cingulate responses to negative affective information. Repeated SSRI treatment can also normalize hyperactive responses to positive affective information.

In subchronic treatment, citalopram was found to reduce ventral striatal and medial orbitofrontal activity in response to reward, while reboxetine had the opposite effect. A two-week comparison of SNRI and SSRI treatment found the SNRI to enhance reward-related ventral striatal activity in healthy volunteers, while the SSRI diminished neural processing of both aversive and rewarding stimuli, resulting in a general dampening of negative and positive affective experiences. The latter provides a plausible neural mechanism for the emotional blunting phenomenon observed in some MDD patients during treatment.

Can the model explain the delayed onset of clinical effect and symptomatic relief?

Acute antidepressant treatment can produce early, unconscious changes in AP prior to mood improvement; translating these changes into improved mood requires interaction with the social environment, which accounts for the subjective delay in response to antidepressants.

The requirement for changes in NABs and interaction with the social environment helps explain some variance in clinical response to antidepressant treatments. Treatment-resistant MDD patients tend to have long-standing NABs or social environments that don't allow an improvement in mood. Changes in AP could thus account for the global resolution of the multiple syndrome domains in MDD: a reduction in preoccupation with negative information would improve mood, reduce social withdrawal, and free resources for other cognitive tasks. A range of antidepressants decrease submissive and increase affiliative problem-solving behaviours in healthy volunteers, demonstrating that reduced NABs translate into improved social interactions/behaviour.

Clinical applications

The CNM suggests the possibility of predicting the long-term effectiveness of a certain antidepressant for a certain individual from the magnitude of the antidepressant's initial effects on behavioural or neural measures of affective processing, such as the magnitude of the increase in recognition of happy facial expressions produced by a single dose of citalopram or reboxetine.

Since AP appears to be altered by a range of antidepressants with different neurochemical actions, behavioural and neural measures of AP could be a useful predictor of the therapeutic potential of a new antidepressant. To be effective, the agent must be able to alter affective processes.

Processing measures can be used in the development and efficacy screening of novel antidepressants and in tailoring treatment to individuals.

Article summary with Risk, resilience, and gene x environment interactions in rhesus monkeys by Suomi - 2006

Introduction 

There are differences between all individuals in the way they respond to acute/chronic stress. Many individuals who experience traumatic events exhibit only minimal effects on their biological, psychological, and emotional functioning. However, some individuals consistently respond to even the slightest changes in their physical or social environment with profound emotional, psychological, and physiological distress that often reappears without any obvious subsequent provocation.

What factors underlie these differences in responses to stress? Are they due to individual genes, their environment, or an interaction of both? This review tries to answer those questions based on rhesus monkeys and discusses some implications of those findings.

Of the monkeys these findings are based on, which grew up in a naturalistic setting, 20% consistently react to novel, mildly stressful social situations with unusually fearful and anxious-like behaviour, accompanied by prolonged hypothalamic-pituitary-adrenal (HPA) axis activation.

5-10% of the monkeys growing up in the same conditions are likely to exhibit impulsive and/or inappropriate aggressive patterns of behavioural response under similar circumstances. Monkeys in the latter subgroup also show chronic deficits in serotonin metabolism. 

Development of Individual Differences in Stress Reactivity

Monkeys that grow up in a naturalistic setting spend most of their first month in intimate physical contact with their mother, establishing a strong and enduring bond between mother and infant. From the second month the infant starts to explore its immediate environment, using its mother as a secure base. Later, infants spend increasing amounts of time away from their mothers and begin to establish relationships with other (same-aged) members of their social group. Through the rest of their childhood they spend many hours in active social play with these peers. Nearly every social behaviour important in adulthood is developed, practiced, and perfected during peer play, most notably behaviours leading to successful sexual reproduction and the socialization of aggression (which appears at 4-6 months of age).

Excessively aggressive and/or fearful monkeys show significant deviations from their species-normative pattern of social development. Fearful infants leave their mothers to explore their environment at a later age and exhibit low rates of exploratory behaviour later on. They seem reluctant to interact with monkeys other than their mother, resulting in less peer play. When physically separated from their mothers they will exhibit excessive behavioural distress, accompanied by higher and prolonged levels of cortisol which are predictive of differential responses to other situations later in life. 

Overly aggressive infants, especially males, typically display their aggressive tendencies initially in the context of social play with peers. They readily respond to play invitations and tend to initiate rough-and-tumble play bouts, often escalating into episodes of actual physical aggression with their play partners. Other monkeys in their social groups start avoiding most interactions with these aggressive youngsters, resulting in increased social isolation. Studies reveal a significant negative relationship between the incidence of aggression in the context of play and CSF 5-HIAA concentrations; the most aggressive males tend to have the lowest CSF 5-HIAA levels.

Heritability analyses have shown that individual differences in behavioural and adrenocortical responses to separation have a significant genetic component, and that differences in CSF 5-HIAA are remarkably stable over the lifespan and also highly heritable.

Effects of Differential Social Rearing Environments

In the same rhesus monkey colony, other monkeys have been reared from birth in the absence of any adults but in the continuous presence of same-aged peers, after an initial month in the researchers' neonatal nursery. After 6 months of this peer-only (PO) rearing, infants are introduced into large social groups containing both same-aged PO-reared and mother-and-peer (MP)-reared monkeys. Both PO- and MP-reared monkeys remain in the same group until puberty.

PO-reared monkeys quickly develop strong attachment-like bonds with each other within days after release from the neonatal nursery. However, these 'hyperattachments' tend to be nonfunctional, largely because a peer isn't as good as a mother at providing a secure base for exploration or soothing an infant when it becomes frightened or upset. As a result, PO-reared infants tend to explore and play less than MP-reared infants in their first 6 months. PO-reared monkey groups also show more extreme behavioural and adrenocortical responses to social separation at 6 months of age, and they display the same behavioural and serotonergic characteristics that differentiate overly aggressive MP-reared monkeys from others in their cohort. A potential reason is that they experience play deprivation even though they are in the continuous presence of potential playmates; as they grow older they become more aggressive than most of their MP-reared group members.

Early social experiences like maternal deprivation can have significant and long-lasting effects on behavioural and biological development over and above any contributions to individual differences attributable to heritable factors. 

GxE Interactions

Rhesus monkeys have the same 5-HTT gene and functional polymorphism as humans. By genotyping the monkeys from the colony, researchers have been able to search for possible GxE interactions involving differential early experience and allelic variation in the 5-HTT gene. Monkeys carrying the short allele of this gene show delayed early neurobiological development, impaired serotonin functioning, and excessive aggression, HPA reactivity, and alcohol consumption as they grow up, but only if they have been PO-reared; MP-reared monkeys carrying the same short allele exhibit species-normative patterns. MP-reared monkeys with the short allele even consume less alcohol than their MP-reared counterparts carrying the long allele, suggesting that the short allele represents a significant risk factor for PO-reared monkeys, while the opposite is true for MP-reared monkeys. Though it seems apparent that significant GxE interactions occur in monkeys and humans, the demonstration of such interactions has been largely statistical and hence subject to multiple interpretations. One interpretation emphasizes a good environment protecting an individual carrying a 'bad' gene: MP-rearing buffers individuals carrying the less efficient allele from developing the aberrant patterns shown by PO-reared monkeys with the same allele, so a good environment can protect carriers of a 'bad' gene from detrimental developmental outcomes.

These two competing interpretations of the same data sets are not necessarily mutually exclusive. Following the author, different developmental processes representing both interpretations can take place in the same individual during the same periods of development. New research demonstrates that differences in maternal licking/grooming of rat pups during their second postnatal week can alter gene expression, with consequences that are life-long and can be transmitted to the next generation of offspring. Given these findings, the possibility that specific early social experiences can similarly alter gene expression in primates no longer seems far-fetched. If true, this could have implications for the prevention of adverse outcomes in individuals carrying the less efficient allele of these and other 'candidate' genes.

Summary

Rhesus monkeys, like humans, exhibit individual differences in their reactions to environmental stress. Some are excessively fearful in response to changes in their environment throughout development; others are overly impulsive/aggressive. It's possible to identify genetic and environmental factors that contribute to these different response patterns, but recent evidence suggests that GxE interactions may be at least as important in shaping individual development in this species, possibly through mechanisms by which specific aspects of the environment influence the expression of certain genes at certain times of development. Lastly, because GxE interactions require genetic variation at the species level to take place, the fact that rhesus monkeys - and humans - apparently possess greater allelic variability in certain genes than other primate species may contribute to their remarkable resilience and adaptive success relative to other primates.

Article summary with Early life adversity and the epigenetic programming of hypothalamic-pituitary-adrenal function by Anacker a.o. - 2014

Purpose: We review studies in human and nonhuman species that examine the hypothesis that epigenetic mechanisms, particularly those affecting the expression of genes implicated in stress responses, mediate the association between early childhood adversity and later risk of depression.

The quality of family life influences the development of individual differences in vulnerability for affective illnesses. Victims of childhood abuse or parental neglect are at a greater risk for affective disorders (association found between adverse childhood experiences and the risk for depression). Mediators in this relationship: personality (neuroticism), low self-esteem, conduct disorder, increased risk of adverse life events, low social support, difficulties in interpersonal relationships. Other variables compromising cognitive and emotional development and increasing risk of depression/anxiety disorders to a level comparable to that for abuse: familial dysfunction/conflict, persistent neglect, cold, distant parent-child relationships, harsh and inconsistent discipline. But family life can also be a source of resilience against chronic stress: warm, nurturing families promote resistance to stress and diminish vulnerability to stress-induced illness. The epidemiology of affective disorders reflects the influence of family life on neural development and mental health.

This review gathers evidence linking variations in parental care to the epigenetic mechanisms regulating a specific phenotype, stress reactivity, that influences vulnerability for many forms of psychopathology.

Parental influences on stress reactivity 

The relationship between the quality of the early social environment and health in adulthood seems to be partly mediated by individual differences in the neural systems underlying the expression of behavioural and endocrine responses to stress. Physical and sexual abuse in early life increases endocrine and autonomic responses to stress in adulthood. The influence of familial depressive illness is mediated by increased stress reactivity, enhancing the individual's response to mild/regular stressors. Individuals with early adverse experience appear to be sensitized to the depressive effects of acute stress in adulthood.

Hypothalamic-pituitary-adrenal axis function and Major Depressive Disorder

Hypothalamic-pituitary-adrenal (HPA) axis activity is governed by the secretion of corticotropin-releasing factor (CRF) and vasopressin (AVP) from the paraventricular nucleus of the hypothalamus, which in turn activates secretion of adrenocorticotrophic hormone (ACTH) from pituitary corticotropes, which then stimulates the synthesis and secretion of glucocorticoids (cortisol in humans and other primates, corticosterone in rodents) from the adrenal cortex. Cortisol/corticosterone bind to and activate glucocorticoid (GR) and mineralocorticoid (MR) receptors in multiple target tissues, including brain regions that influence hypothalamic synthesis of CRF and vasopressin, thereby regulating HPA activity. Activation of GR in particular leads to negative feedback inhibition of CRF and AVP secretion from the hypothalamus and acts directly on the secretion of ACTH from pituitary corticotropes.

Humans have a daily pattern of HPA activity: cortisol levels rise late in the night, rise further 30 minutes after waking, and then decline gradually over the day.

Depression is associated with elevated basal levels of ACTH and cortisol. The difference in basal cortisol levels is greatest during the afternoon, when cortisol should be falling. The increase in these levels is marked in melancholic and psychotic depression, but not in atypical depression. Since elevated cortisol acts to inhibit subsequent HPA activity, it's suggested that glucocorticoid negative feedback is impaired in depressed patients. Successful antidepressant treatment is associated with resolution of the impairment in the negative feedback on the HPA axis by glucocorticoids. Normalization of glucocorticoid function by antidepressants is a significant predictor of long-term clinical outcome.
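The logic of impaired feedback can be sketched with a minimal, purely illustrative model (not from the article): cortisol production is driven at a constant rate and inhibited in proportion to current cortisol via GR activation. Weakening the feedback gain, as suggested for depression, raises the steady-state cortisol level. All units and parameter values below are arbitrary.

```python
def steady_state_cortisol(drive, feedback_gain, steps=1000, cortisol=0.0):
    """Toy negative-feedback loop: constant production 'drive' minus
    GR-mediated inhibition proportional to the current cortisol level.
    Iterating the update settles near the equilibrium drive/feedback_gain."""
    for _ in range(steps):
        cortisol += 0.1 * (drive - feedback_gain * cortisol)
    return cortisol

# With intact feedback, cortisol settles at a lower level; halving the
# feedback gain (impaired GR signalling) doubles the equilibrium level:
intact = steady_state_cortisol(drive=1.0, feedback_gain=1.0)
impaired = steady_state_cortisol(drive=1.0, feedback_gain=0.5)
```

The sketch only illustrates why reduced negative feedback and elevated basal cortisol go together in the account above; it is not a physiological model.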

Enhanced HPA activation is also apparent at the level of the brain: there are higher levels of CRF/CRF-expressing neurons in postmortem samples from depressed compared to nondepressed individuals, and CRF1 receptors are reduced in postmortem brains from suicide victims, many of whom showed a history of depression.

While increased HPA activity among depressed patients isn’t universal, it’s of clinical significance.

Developmental adversity in HPA function

Childhood maltreatment is associated with increased HPA response to stress. Childhood abuse was the strongest predictor of increased HPA activity (i.e. ACTH responsiveness), followed by abusive events, adulthood traumas, and depression. 

CRF concentrations correlate with the severity/duration of physical and sexual abuse. High CRF may arise due to GR down-regulation and impaired negative feedback inhibition. 

Childhood adversity influences HPA responses to stress. It also moderates the relation between stressful life events in adulthood and depression, with increased risk for depression or anxiety in response to moderately stressful circumstances among individuals with a history of childhood adversity.

Developmental adversity and epigenetic regulation of HPA function

This study explores potential mechanisms for parental effects examining the influence of variations in maternal care in rats on the development of individual differences in behavioural and endocrine responses to stress.

Lactating female Long-Evans rats show considerable variation in the frequency of pup licking/grooming (LG). Individual differences in the frequency of pup LG among adult female rats are reliable across litters, and thus a stable feature of the maternal phenotype. Variations in pup LG over the first week of life are found to affect the development of behavioural and HPA responses to stress in adulthood.

There are differences in HPA responses to acute stress apparent in circulating levels of ACTH and adrenal corticosterone. As adults, the offspring of high-LG mothers show more modest plasma ACTH and corticosterone responses to acute stress compared to low-LG offspring. Offspring of high-LG mothers also show increased hippocampal GR mRNA and protein expression, enhanced glucocorticoid negative feedback sensitivity, and decreased hypothalamic CRF mRNA levels.

The effects of maternal care on gene expression and stress responses are reversed with cross-fostering: the stress responses of animals born to low-LG mothers and reared by high-LG mothers are comparable to those of normal offspring of high-LG mothers, and vice versa. Studies show that pups brushed for 15 min/day show increased hippocampal GR expression, revealing a direct relation between maternal care and the phenotypic development of offspring.

Tactile stimulation from maternal licking appears to be the critical environmental signal for the regulation of hippocampal GR expression in newborns. Maternal effects on hippocampal GR expression are mediated by increases in hippocampal serotonin (5-HT) turnover and expression of the nerve-growth factor-inducible factor-A (NGFI-A) transcription factor. Maternal licking increases hippocampal expression of the transcription factor (NGFI-A).

The 5' non-coding variable exon 1 region of the hippocampal GR gene contains multiple alternate promoter sequences, including a neuron-specific exon 17 sequence. Increased pup LG enhances hippocampal expression of GR mRNA splice variants containing the exon 17 sequence, which contains an NGFI-A response element. Pup LG increases hippocampal NGFI-A expression and binding to the exon 17 promoter.

There is a similar effect on hippocampal Gad1, an NGFI-A regulated gene that encodes for glutamic acid decarboxylase, the rate-limiting enzyme for GABA synthesis. The association of NGFI-A with the Gad1 promoter is increased in the offspring of high compared with low LG mothers, but only following a nursing bout. This finding suggests that maternal care regulates the expression of a range of NGFI-A-sensitive genes. 

Mechanism by which hippocampal GR expression remains elevated following weaning and separation from the mother: one possibility is that the increased NGFI-A-exon 17 interaction occurring within hippocampal neurons in the pups of high-LG mothers results in an epigenetic modification of the exon 17 sequence that alters NGFI-A binding and maintains the maternal effect into adulthood. The initial studies focused on potential influences on DNA methylation.

Preliminary studies revealed greater methylation across the entire exon 17 GR promoter sequence in the hippocampus of adult offspring of low-LG mothers. These findings suggest a parental effect on DNA methylation patterns in the offspring. The effect of maternal care involves significant alterations in the methylation status of the NGFI-A site.

These results suggest a direct relation between maternal care, histone acetylation, DNA methylation of the exon 17 GR promoter, GR expression, and HPA responses to stress.

Variations in parent-offspring interactions epigenetically program hippocampal GR expression and thus the nature of the HPA response to stress. However, subsequent studies reveal effects of early experience on multiple components of the HPA axis, and in each case there is evidence for stable epigenetic programming. 

Environmental conditions that increase frequency of pup LG in the rat are associated with decreased paraventricular CRF expression. This maternally regulated decrease in CRF expression is accompanied by an increase in hippocampal GR expression. However, the difference in CRF expression develops independent of GR regulation, since CRF expression occurs earlier in development than does the difference in hippocampal GR expression. This is not surprising since hippocampal-mediated negative feedback emerges only about the time of puberty in the rat. 

Maternal regulation of HPA function extends to the level of the pituitary. Maternal separation of neonatal mice produces an enduring hypomethylation of the Pomc gene, which encodes the ACTH pro-hormone proopiomelanocortin, increased Pomc mRNA expression, and increased basal and CRF-induced levels of ACTH.

The quality of postnatal maternal care epigenetically programs gene expression at multiple levels of the HPA axis to regulate both basal and stress-induced activity. 

Epigenetic regulation of glucocorticoid receptor expression in humans 

They found decreased hippocampal GR expression in samples from suicide completers with histories of childhood maltreatment compared with controls (victims of sudden, involuntary fatalities).

The exon 1F sequence (the human homolog of rat exon 17) shows increased DNA methylation and decreased NGFI-A binding in samples from suicide victims with a history of maltreatment. This finding bears considerable similarity to the maternal effect in the rat and is suggestive of early-environment regulation of the neural epigenome in humans.

Forebrain GR activation inhibits HPA activity through tonic negative-feedback inhibition. Thus, selective knockdown of GR expression in the corticolimbic system in rodents is associated with increased HPA activity under basal as well as stressful conditions. Conversely, GR overexpression is associated with a dampened HPA response to acute stress. These findings are consistent with a working hypothesis linking early social environment to epigenetic modifications of the GR gene and expression, and HPA function. 

Childhood maltreatment associates with an increased level of exon 1F methylation. Furthermore, the methylation status of the promoter was closely correlated with both the frequency and severity of maltreatment. 

Summary 

Epigenetic modifications of genes implicated in HPA function are a mediating process that links the quality of childhood experience to the risk for major depression. But there are caveats that limit the degree to which this hypothesis might be applied across the population: altered HPA activity is apparent only in a subset of MDD patients and it is not clear that this subset is defined by developmental history. It is clear that genotype moderates the impact of adverse childhood experiences and resulting epigenetic modifications. 

The majority of variably-methylated regions were best explained by a gene x environment interaction. Unfortunately, with few exceptions, preclinical studies haven’t established models of gene x environment interactions. Likewise, preclinical studies have yet to widely capitalize on the finding that adverse childhood experience alters the response to antidepressant medications. This study provides a basis for studies of the mechanism by which developmental history might contribute to treatment resistance in MDD. One question of interest for the development of effective patient stratification is whether epigenetic marks associated with childhood adversity might identify individuals that are resistant to medications. 

A challenge for future research is integrating our knowledge of the importance of childhood experience into treatment models.

Article summary with Effects of the social environment and stress on glucocorticoid receptor gene methylation: A systematic review by Turecki & Meaney - 2016

Introduction 

The early-life social environment can induce stable changes that influence neurodevelopment and mental health, meaning that they have a persistent impact on gene expression and behavior through epigenetic mechanisms. The hypothalamus-pituitary-adrenal (HPA) axis is sensitive to changes in the early-life environment that associate with DNA methylation of the neuron-specific exon 1₇ of the glucocorticoid receptor (GR). 

DNA methylation is a process by which methyl groups are added to the DNA molecule. Methylation can change the activity of a DNA segment without changing the sequence. When located in a gene promoter, DNA methylation typically acts to repress gene transcription.

In rats, the offspring of mothers with an increased frequency of pup licking/grooming (LG) show increased GR expression, greater negative feedback regulation over hypothalamic corticotropin releasing factor (CRF), and more modest responses to stress compared with offspring of low LG mothers. Variations in maternal LG are linked to an epigenetic modification of a neuron-specific exon 1₇ GR promoter such that increased maternal LG associates with decreased methylation of the exon 1₇ promoter and increased hippocampal GR expression. 

In humans, evidence for a long-term effect of early-life adversity (ELA) on the epigenetic state of the genome was observed while investigating the methylation state of the GR gene in the hippocampus of individuals who died by suicide and had a history of child abuse. ELA in humans reprograms the DNA methylation pattern of the GR gene exon 1F (GR1F), reducing its expression in the hippocampus of suicide completers with a history of child abuse compared to non-abused suicide completers and healthy control subjects. Another study found that children born to mothers with depression, irrespective of SSRI use, exhibited higher GR1F methylation levels. 

The article summarizes 40 studies that used either humans or other mammals, such as mice or rats, to find evidence for their hypotheses.

Early life adversity 

Experimental groups and GR region Studied

In animal models, ELA was characterized by the use of either maternal care or maternal separation models. All of the animal studies examined exon 1₇ when studying the GR region. 

In human studies, ELA was defined as exposure to traumatic events in childhood, including emotional, physical, or sexual abuse, neglect, early parental death, and other traumatic events. The majority of human studies examined exon 1F; one examined exon 1D, two also included exons 1B and 1C, and three examined exon 1H.

Effect on Methylation

The ELA group contains results from 10 animal studies and 10 human studies. 

7/10 animal studies reported a significant increase in methylation in the exon 1₇ promoter due to ELA, while 3 studies reported no such change. 9/10 human studies reported increased promoter methylation with ELA.

Parental Stress

Experimental groups and GR region Studied

One animal study used a variety of non-pain-inducing, non-habituating stressors over the course of 7 days to stress mouse dams. The study examined exon 1₇ when studying the GR region. 

Human studies examined the methylation status of children of women who had experienced anxiety and mood disorders (2 studies), pregnancy-related anxiety (1 study), violence (1 study), or war stress (2 studies) during pregnancy. Another study examined the effects of parental stress on GR gene methylation in adolescence, and an additional study examined the methylation pattern of individuals whose parents had experienced war but not during pregnancy. 7/8 studies examined exon 1F, while one study examined exons 1B and 1D, when studying the GR region. 

Effect on Methylation

All 8 studies investigating exon 1F (exon 1₇ in the animal study) reported increased methylation of this exon in offspring of parentally stressed individuals. Of these, 2 studies also reported decreased methylation at specific CpG sites and under particular conditions, such as when the stressor occurred only during the first or second trimester of the pregnancy. The one study investigating exons 1B and 1D reported a decrease in methylation of exon 1B and an increase in methylation of exon 1D in children of women experiencing fear of delivery in all trimesters or fear for the integrity of the baby in the first trimester. 

Psychological stress/ Psychopathology 

Experimental groups and GR region Studied

3 animal studies used acute or chronic stress models to assess the impact of social stressors on GR gene methylation, all of which examined the exon 1₇ region.

8 human studies examined the effects of psychopathologies on human GR gene methylation. Specifically, one study involved the response to stress, a second borderline personality disorder (BPD), a third bulimia nervosa, another two PTSD, and an additional two depression. Of these, 6/8 examined exon 1F; in addition, 3 studies examined exon 1B and 2 studies examined exons 1J and 1E. 

Effect on Methylation 

In the animal models, 2/3 studies reported increased exon 1₇ methylation in stressed animals. 

In human studies, results were more varied. One study reported increased methylation of exon 1F, while 3 studies reported no change and 2 reported decreased methylation. Among the other exon variants examined, there was no consensus on the effect of psychological stress/psychopathology on methylation status. 

Conclusion

This article reviewed a set of 40 studies that researched GR gene methylation in relation to various psychological stressors, including parental stress, ELA in social environments in animals, ELA in humans, and psychological stress or psychopathology in adults; most of the studies investigated the GR exon variant 1F (1₇ for animals). 

The majority of the studies concerning ELA and in utero adversity reported increased methylation at the respective exon. In particular, a negative early-life social environment was associated with greater exon 1F methylation in the large majority of the studies. These findings show a compelling consensus across studies and methodologies (e.g., tissue samples from different sources, such as brain or blood, were used in different studies). Furthermore, childhood maltreatment associates with increased exon 1F methylation, and the promoter methylation status was closely correlated with both the frequency and severity of maltreatment. One study reported that increased methylation of the exon 1F NR3C1 gene promoter in leukocytes is associated with the disruption of normal parent-offspring interaction or maltreatment, and linked this to an attenuated cortisol response. Another study supported this claim and noted that parental loss was also associated with increased methylation of the same gene promoter. 

Recent evidence suggests that epigenetic plasticity is sustained in the brain throughout adulthood, potentially as a mechanism to cope with the evolving demands of the environment, yet, there are clear moments during development when plasticity is heightened and these may be more strongly associated with the establishment of life-long epigenetic modifications. 

Studies with adults revealed that childhood maltreatment is associated with an increased HPA response to stress. Subsequent statistical analyses revealed that childhood abuse was the strongest predictor of adrenocorticotropic hormone (ACTH) responsiveness, followed by the frequency of abusive events, adulthood traumas, and depression. An interaction term of childhood and adulthood trauma proved to be the best predictor for ACTH responses, suggesting that a history of childhood abuse is related to increased stress reactivity, which is further enhanced when additional trauma occurs in adulthood. 

Cerebrospinal fluid CRF concentrations were correlated with the severity and duration of physical and sexual abuse, and high CRF may arise due to GR downregulation and impaired negative feedback inhibition. Some subjects who have experienced childhood abuse exhibit decreased cortisol production, which may reflect an adaptation to chronically stressful situations, whereas elevated cortisol production may prime individuals to react to unpredictable stressors; both situations may constitute ELA. Currently, it is difficult to draw conclusions on the overall impact of GR methylation variations on basal and reactive cortisol levels, as the majority of studies investigating GR promoter methylation did not measure cortisol levels. 

These findings suggest that childhood adversity stably influences HPA responses to stress, which is consistent with the idea that childhood maltreatment sensitizes neural and endocrine responses to stress, thus establishing a vulnerability for mood disorders. 

Recent studies suggest that epigenetic programming of HPA function occurs at multiple levels of the HPA axis in addition to effect on hippocampal GR expression. 

Environmental conditions that increase the frequency of LG in the rat are associated with decreased paraventricular CRF expression. Augmented maternal care, in turn, reduced the number of excitatory synapses onto CRF neurons. Conversely, disruption of maternal care in mice enhanced glutamatergic transmission onto hypothalamic CRF neurons in offspring. Moreover, prolonged periods of maternal separation alter the methylation state of the promoter for the Avp gene, increasing hypothalamic arginine vasopressin synthesis and HPA responses to stress. These findings reveal that the quality of postnatal maternal care in rodents epigenetically programs gene expression at multiple levels of the HPA axis to regulate both basal and stress-induced activity. 

A remarkable feature of all these findings is the coordinated epigenetic effect on multiple genes, in multiple tissues, that collectively serves to increase HPA responsivity to stress following early social adversity. 

Article summary with Bias for the (un)attractive self: On the role of attention in causing body (dis)satisfaction by Smeets a.o. - 2011

Body dissatisfaction has been classified as a main diagnostic characteristic of eating disorders. Individuals with eating disorders suffer from severe feelings of unattractiveness and fatness and are characterized by intense body loathing. Cognitive models explain eating disorder symptoms in terms of maladaptive knowledge structures that bias the processing of disorder-relevant information. One type of cognitive bias that's been extensively studied is attentional bias (AB) – the tendency to selectively attend and give priority to disorder-relevant information.

Research shows that eating-disorder patients have an attentional bias for body- and food-related information and for specific body parts. Study results showed that eating disorder patients had a dysfunctional way of looking at their own bodies versus a control body. When attending to their own bodies, patients attended more to their self-defined unattractive body parts, whereas healthy people attended more to their own attractive body parts. This pattern was reversed when exposed to another body: patients looked at the other body's attractive parts, while healthy people looked at the other person's unattractive parts.

Data suggests that a tendency for selectively attending to one’s own unattractive body parts may maintain or cause severe feelings of body dissatisfaction. This study will experimentally test the role of selective attention for (un)attractive body parts in body (dis)satisfaction to determine whether bias is also causal to (dis)satisfaction.

Study 1

A negative or positive body image bias was temporarily induced, by presenting participants with a picture of their own body and training them to selectively attend to either their three self-defined most unattractive body parts (negative bias training) or their three most attractive body parts (positive bias training).

Hypotheses: a) negative bias training will induce a decrease in body and weight satisfaction, whereas positive bias training will induce an increase in body and weight satisfaction; b) the positive counterinduction training will repair body and weight satisfaction.

Method

Participants: A total of 47 female undergraduate students, randomly assigned to either the positive or the negative bias training.

Materials and assessment:

  • Pictures: pictures taken of participants in their underclothes.
  • Individual stimulus selection: participants fill out a questionnaire selecting body parts for the individually tailored bias trainings. Shown a picture of their own body and asked to name and rank all body parts marked from most (10) to least attractive (1).
  • Apparatus: participants' eye movements were recorded. Eye movements were only registered as part of the gaze-contingent manipulation that was used – not relevant for the hypotheses.
  • Satisfaction and mood: body shape questionnaire used to select participants with a moderate level of body dissatisfaction.
  • Attentional bias induction: to induce a temporary attentional bias for the participant's self-defined attractive or unattractive body parts, they developed an individually tailored attentional bias induction training. During the training participants were instructed to detect and identify the nature of probes appearing at different locations on a fuzzy background picture of the participant's body and three neutral objects in the periphery.
  • Manipulations: in both conditions the probe appeared on 10% of trials on one of three neutral objects located outside the body. In the negative bias training it appeared on 90% of trials one of the three most unattractive body parts, in positive bias training it appeared on 90% of the trials on one of three most attractive body parts. Negative bias condition participants were given an additional positive bias training of ten minutes serving as a positive counterinduction training.
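The 90/10 probe allocation described above can be sketched as a simple trial schedule. This is a hedged illustration only: the body-part and object names are hypothetical placeholders, not the study's actual stimuli, and the trial count is invented.

```python
import random

def make_trial_schedule(target_parts, neutral_objects, n_trials=200, p_target=0.9, seed=1):
    """Sketch of the probe-location schedule: the probe appears on one of the
    target body parts on 90% of trials and on one of the three neutral objects
    on the remaining 10% of trials."""
    rng = random.Random(seed)
    schedule = []
    for _ in range(n_trials):
        if rng.random() < p_target:
            schedule.append(rng.choice(target_parts))    # probe on a target body part
        else:
            schedule.append(rng.choice(neutral_objects))  # probe on a neutral object
    return schedule

# Negative bias training: probes cluster on the three self-defined most unattractive parts
# (hypothetical labels for illustration).
negative = make_trial_schedule(["hips", "thighs", "stomach"], ["lamp", "chair", "plant"])
share_on_body = sum(loc in {"hips", "thighs", "stomach"} for loc in negative) / len(negative)
```

Swapping in the three most attractive parts as `target_parts` would yield the positive bias training under the same 90/10 split.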

Procedure: all participants were tested individually. First session – the participant was photographed in her underwear, then took the questionnaire to select body parts for the bias training. Second session – a week later, the experimenter inserted the participant's self-defined attractive and unattractive body parts into a computer programme. The participant was given a range of visual analogue scales measuring body/weight satisfaction and mood. The eye tracker was placed on the participant's head as she sat in front of the computer. Participants thought they were taking part in a visual discrimination task. At the end of the 20-minute training, body and weight satisfaction and mood were measured again. All participants were debriefed in writing after the experiment ended.

Discussion

This study aimed to experimentally test the causal role of selective attention for (un)attractive body parts in the induction of body (dis)satisfaction. It investigated whether training healthy participants to selectively attend to their most unattractive body parts would lead to increased feelings of body/weight dissatisfaction, and whether training them to attend to their three most attractive body parts would lead to increased feelings of body/weight satisfaction. The negative bias training did decrease body and weight satisfaction, whereas the positive bias training did not significantly increase it; the positive counterinduction training, however, did increase body satisfaction in those first assigned to the negative bias induction training. Results also show that participants showed a decrease in mood after bias training – this decrease was smaller for people in the positive condition. This shows that inspecting a picture of one's own body for 20 minutes negatively affects one's mood, but when being trained to attend to one's attractive body parts this effect is less severe than after attention training in unattractive body parts.

The hypothesis that temporarily inducing selective attention for self-defined attractive body parts would lead to increased feelings of body satisfaction was only partly supported. Positive training in healthy women didn't induce more satisfaction – this suggests that it might in general be more difficult to induce positive than negative biases, not just in reasonably satisfied women.

Support was found for the idea that positive training leads to increased body satisfaction: it worked for the group that first received a negative body image bias induction and, as a consequence of that training, showed a recent significant decrease in body satisfaction. These women had more room for improvement than those assigned to the positive training without a preceding negative bias induction, but a more crucial difference may be that their relative dissatisfaction was recently induced and therefore not that tenacious.

Finding that a temporary decrease in body and weight satisfaction could be reversed by training participants to attend to their most attractive body parts shows that changes in a positive direction are possible.

Study 2

To investigate whether the positive bias training will be effective in a sample of women with higher levels of body dissatisfaction, a second study was conducted. Effects of positive bias training were compared with effects of random exposure to the body.

Participants were presented with a picture of their own body, and were trained to either attend to their three most attractive body parts (positive bias training) or to attend to all their body parts (control training). Hypothesis: positive bias training will induce an increase in body/weight satisfaction, whereas the control training won’t lead to changes in body/weight satisfaction.

Method

Participants: 21 female undergraduate students, selected on the basis of a high score on the BSQ, were invited to participate in a study ostensibly testing the relation between perception and concentration. Participants were randomly assigned to the positive training or the control training.

Materials and assessment:

  • Pictures: same as study 1
  • Individual stimulus selection: select body parts for the individually tailored bias training, participants asked to fill in ‘perception and concentration ranking questionnaire’ – same as study 1.
  • Apparatus: participants' eye movements were registered.
  • Satisfaction and mood: same as study 1.
  • Attentional bias induction: the procedure of the positive attentional bias induction was similar to study 1, except that the present training was 15 minutes longer than the previous one. Effects of the positive training were compared with effects of a control training.
  • Manipulations: in both conditions the probe appeared on 10% of the trials on one of three neutral objects located outside the body. In the positive training, the probe appeared on 90% of trials on one of the three attractive body parts; in the control training it appeared on 90% of trials randomly on all body parts, except for the participant's most attractive and unattractive body parts.

Procedure: procedure exactly the same as study 1.

Discussion

The second study aimed to investigate if positive bias training increased body/weight satisfaction in women highly dissatisfied with their bodies, compared with control training. Hypothesis supported – shown that training highly dissatisfied women to selectively attend to their three attractive body parts led to increased feelings of satisfaction. Conversely, being trained to attend to all body parts didn’t lead to any changes in body/weight satisfaction. Results demonstrated that participants showed a decrease in mood after training – possibly due to repetitive or ‘boring’ nature of the training.

Results show that focusing on one's own attractive body parts during a 35-minute training session might be a promising intervention for improving women's body image.

General Discussion

These studies show that how women attend to their bodies can cause them to feel better or worse about themselves. It's shown that training healthy participants to focus on their most unattractive body parts caused them to feel less satisfied, whereas training highly dissatisfied participants to focus on their most attractive parts caused them to feel more body satisfaction.

Generally, findings provide support for the causal role of selective attention in body (dis)satisfaction. As eating-disorder patients have been shown to selectively attend to their own unattractive body parts, it may be suggested that repeatedly attending to these parts might cause and maintain feelings of body dissatisfaction.

Results of study 2 show that changing the way women dissatisfied with their bodies attend to their bodies leads to positive changes in the way they feel about them.

Training dissatisfied women to attend to their most attractive body parts preliminarily proved to be a new technique for improving feelings of body dissatisfaction. However, exposing dissatisfied women to a picture of their own bodies without training them to attend to their bodies in a specific way did not lead to any changes in the way they felt about their bodies – this could also be due to the relatively small sample size.

These findings may have implications for the use of body exposure therapy in treatment of eating-disorders.

Finally, though findings are still preliminary and in need of replication, there's reason to believe that treatment programs may benefit from incorporating procedures to teach eating-disordered and other patients dissatisfied with their bodies to focus more on their beautiful body parts. But a main empirical question remains: does training eating-disorder patients to selectively attend to their most beautiful body parts lead to an increase in body satisfaction? Essentially, we'd want to address this hypothesis in a group of patients who're at the end of their treatment programs – it would be too risky to expose patients at the beginning of their treatment programs to this kind of training, as it may reinforce the glorification of their skinny bodies.

In conclusion, the present findings provide support for the etiological significance of biased attentional processes in body dissatisfaction and provide a new and simple way of improving women's body image.

Article summary with Approach bias modification in alcohol dependence: Do clinical effects replicate and for whom does it work best? by Eberla a.o. - 2013

Introduction

Alcoholism is a progressive neurocognitive disorder, in which premorbid vulnerability factors interact with neuroadaptations resulting from excessive drinking. Lack of cognitive control over impulses can be both a risk factor for and a consequence of addiction, with increasing evidence indicating that the latter may be strong when excessive substance use starts in adolescence. Recent research emphasizes the importance of the malleable adolescent brain in the early development of addiction (note: clinical patients are typically adult, with a large age range). Some of the later stages of addiction usually occur later in life, but some of the neurocognitive abnormalities in addiction may reverse after prolonged abstinence or targeted training.

Biomedical research emphasizes that addiction should be seen as a chronic brain disease, but this shouldn't lead to the conclusion that nothing can be done about it. In addition to traditional treatments, a novel set of training paradigms has been developed, known as Cognitive Bias Modification (CBM). These have generated positive first results in the treatment of problem drinking in a community sample and of alcoholism in patients. These interventions aim to directly change the automatic/implicit cognitive motivational processes involved in addiction, of which patients may not always be aware and which are difficult to control or change by more traditional means. This study focuses on re-training one of these implicit processes: the automatically triggered action tendency to approach alcohol.

A first clinical study on the effects of CBM tested the effects of four sessions of A-AAT (alcohol approach-avoidance task) training in alcohol-dependent patients in an inpatient setting. Results showed patients' alcohol-approach bias changing into an alcohol-avoidance bias – generalizing to untrained pictures in the same task. Patients in the training group showed 13% less relapse in the year after treatment compared to the control group.

Little is known about predictors of successful training. Various studies found implicit cognitive processes to be a better predictor of alcohol use in people with relatively weak executive control (EC) capacities. A recent study found that EC moderates the relation between alcohol-approach tendencies and drinking behaviour in high-risk adolescents and may also moderate training effects (stronger effects in those with weak EC). One could expect approach-bias re-training to work particularly well for patients with weak EC; a classic Stroop colour-word interference task was therefore used as a potential moderator. Deviant brain activation during a Stroop task has been associated with risk for addiction in adolescents with a family history of alcoholism, and in children with ADHD when they're off medication.

This study has two main objectives: 1) to test if the effect of adding computerized approach-bias re-training to cognitive behavioural treatment increases abstinence in alcoholic inpatients, and whether the effect on treatment outcome would be mediated by the amount of change in approach-bias. 2) to investigate whether success of training can be predicted by patients’ level of EC and/or background variables.

Methods

Participants

Alcohol-dependent patients admitted to a three-month inpatient treatment programme. Every patient with a primary alcohol dependence diagnosis was included. Exclusion criteria were neurocognitive problems, strong withdrawal symptoms, a history of schizophrenia, and visual or hand-motoric handicaps. None of the patients received anti-craving medications. The final analytical sample consisted of 475 patients.

Assessment and outcome measures

Patients were diagnosed at intake using the computerized version of the CIDI (Composite International Diagnostic Interview) and a diagnostic interview. Both the CIDI and the interview were the basis for the final expert ratings on diagnoses made by clinical psychologists.

Experimental Task

Alcohol-AAT (Approach-Avoidance Task)

The alcohol AAT measures the automatic approach tendency toward alcohol. Participants were asked to react to the format of pictures using a joystick, ignoring the pictures' contents. There were two categories of pictures – 20 alcoholic beverages and 20 soft drinks. Pushing a picture away decreased its size, and pulling a picture closer increased its size.
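As an illustration of how an approach bias is typically quantified from such a task (an assumption – the paper's exact scoring may differ), a common AAT score is the difference between median push and pull reaction times per picture category; the reaction times below are invented:

```python
from statistics import median

def approach_bias(push_rts_ms, pull_rts_ms):
    """Common AAT bias score for one picture category: median push RT minus
    median pull RT (ms). Positive values mean pulling is faster than pushing,
    i.e. an automatic approach tendency toward that category."""
    return median(push_rts_ms) - median(pull_rts_ms)

# Hypothetical reaction times: alcohol pictures are pulled faster than pushed.
alcohol_bias = approach_bias([720, 750, 705], [640, 655, 660])   # 720 - 655 = 65 ms
soft_bias = approach_bias([690, 700, 710], [695, 705, 715])      # 700 - 705 = -5 ms

# Relative bias: approach tendency for alcohol over soft drinks.
relative_bias = alcohol_bias - soft_bias
```

On this sketch, successful training would be reflected in the alcohol score turning negative (an avoidance bias) from pre- to post-test, while the soft-drink score stays unchanged.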

Alcohol- AAT (Approach-Avoidance Task)-Training Version

For CBM, A-AAT was used. Training effect was achieved by presenting alcohol pictures in the push-away format and soft-drink pictures in the pull-closer format.

Colour Stroop (EC)

A variant of the Colour Stroop task was used to assess the strength of inhibitory EC in alcohol-dependent patients. Participants had to decide whether the colour name and the colour of the ink were the same or different.

Questionnaires

Beck Depression Inventory (BDI): Used to measure the severity of depressive symptoms.

Rosenberg Self-Esteem Scale (RSES): Addresses feelings of global self-worth.

Symptom Checklist 90-R (SCL90-R): Measures the physical and psychological impairment of a person within the past seven days. Indicates general level of distress.

Alcohol Abstinence Self Efficacy Scale (AASE): Assesses a patient's confidence to stay abstinent in 20 different situations as well as their temptation to drink in these situations.

Alcohol Use Disorders Identification Test (AUDIT): A screening instrument for problematic alcohol consumption, constructed by the WHO.

Conditions and Experimental Manipulations

Participants were randomly assigned to one of two groups (training vs no training). The experimental group received 12 sessions in which they learned to respond with an avoidance movement to alcohol pictures and an approach movement to non-alcoholic drinks. This was achieved by presenting alcohol pictures in landscape format and soft drinks in portrait format. During training, participants had to correct errors. Each training session started with a short A-AAT assessment to measure training effects of the previous session. The control group received no training at all instead of a sham-training, because no significant difference between no-training and sham-training was found in a previous study.

Procedure, study design, data analysis

In the first week of therapy, patients took part in a 'neuropsychological checkup' including the A-AAT. The previously mentioned questionnaires were also filled out within the diagnostic phase. After the pretest, patients were randomly assigned to the training conditions. In addition to the experimental manipulation, patients received treatment as usual, consisting of abstinence-oriented inpatient CBT-based treatment. One year after discharge, patients received a follow-up questionnaire about alcohol consumption since treatment.

The main outcome variable was treatment outcome at one-year follow-up. Successful outcomes were 1) no relapse at all or 2) a single lapse shorter than 3 days, ended by the patient without further negative consequences. 'No success' was defined as relapse, death, 'no information', or refusal.
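The dichotomous outcome coding above can be sketched as a small decision rule (a hedged illustration; the function and variable names are invented, and the original coding may have handled edge cases differently):

```python
def treatment_outcome(relapsed, lapse_days=0, lapse_self_ended=True,
                      negative_consequences=False, responded=True):
    """Sketch of the one-year outcome coding: 'success' means no relapse at
    all, or a single lapse shorter than 3 days ended by the patient without
    further negative consequences; relapse, 'no information', and refusal
    all count as 'no success'."""
    if not responded:
        return "no success"   # 'no information' or refusal
    if not relapsed:
        return "success"      # no relapse at all
    if lapse_days < 3 and lapse_self_ended and not negative_consequences:
        return "success"      # a single short, self-ended lapse
    return "no success"       # relapse
```

Note that treating non-responders as failures makes this a conservative, intention-to-treat-style coding, which matters for the response-bias discussion below.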

To predict who will profit from the training, the predictive value of the implicit measure (A-AAT) and the questionnaires, as well as demographic facts, and the measure of EC were of interest.

Discussion

Main findings were that the effects of a computerized alcohol-avoidance training were replicated, both on the process trained and on long-term clinical outcomes. Second, this long-term effect was mediated by the change in alcohol-approach tendencies from pre- to post-test. Third, regarding moderators, the strength of alcohol-approach tendencies at pretest, but not the hypothesized weakness of EC, predicted the amount of change in approach tendencies; age significantly predicted who would profit most from training.

Clinical effects of CBM were replicated in both the short and the long term. This strengthens the suggestion to add CBM to regular treatment as a supplement for addictive disorders. Short-term effects were mainly due to the experimental group developing a strong alcohol-avoidance bias, whereas no change in bias for soft drinks occurred; for the control group there was no change for either drink type. Long-term alcohol-dependent patients are likely to feel both approach (negative reinforcement effect of alcohol) and avoidance (awareness of negative long-term effects of drinking) impulses. Despite this awareness, salient situational cues may still trigger approach tendencies in a patient in a risky situation after leaving the clinic. CBM could reduce this ambivalence by strengthening avoidance impulses, weakening approach tendencies, and/or increasing control over these tendencies.

Regarding long-term effects, fewer relapses occurred in the training group, and the training condition remained a significant predictor of treatment outcome. It could be argued that the training only increased patients’ willingness to answer follow-up questionnaires, perhaps because they liked the training or had positive memories of their stay at the clinic (social desirability). However, no increased return rate was found in the sham-training condition. It seems more likely that the difference in return rates reflects a response bias: patients are more likely to answer if they are still abstinent.

Regarding moderation, the amount of change in alcohol-approach bias was moderated by the A-AAT pretest score. It makes intuitive sense that retraining a cognitive bias has the strongest effect in participants who start the training with a strong cognitive bias, but the present findings also indicate that matching at the individual level will be difficult. The hypothesis that weak EC would predict a stronger effect of alcohol-avoidance training was not confirmed. It was found, however, that age moderated the treatment effect, and higher age is correlated with weaker EC.

From a broader developmental perspective, the present findings are interesting: they contradict the notion that CBM can only influence young, malleable brains, since more positive effects were found for older alcoholic patients. These findings are promising from a treatment perspective. Many individuals experiencing problems with alcohol spontaneously ‘mature out’ of these problems, and in adult community samples many individuals achieve spontaneous recovery without formal treatment. People who enter formal treatment tend to be older and to have comorbid psychiatric problems. CBM may train control over the impulse to drink in a concentrated way, and the alcohol stimuli may help trigger this ability in risky situations after treatment discharge. One could also investigate training of general EC abilities: goal-management and mindfulness training, as well as other interventions with a cognitive training element, have shown initial promising results in addiction.

Finally, though these results are promising, there are some limitations.

  1. The measure of EC used here was a variant of the Stroop task and may have been easier than the original.
  2. Brain activity during the task could also be measured.
  3. Both the current study and the one by Wiers et al. (2011) were conducted in the same clinic and carried out by the same research group. To draw firm conclusions about the use of this CBM as a treatment it needs to be tested in a multi-center study with different treatment facilities.
  4. Success of the CBM treatment may depend on the motivation of patients.
  5. Important line of future research is to identify necessary conditions for the CBM to be effective.
  6. A related domain of further applied interest concerns the possibility of a ‘take-home’ training. CBM is simple and computerized, so patients could continue their training outside the clinic at home – may increase the transfer of success achieved with training.
  7. Another issue concerns mediation and moderation. The present finding that age moderated the effectiveness of the A-AAT retraining is valuable when it comes to matching trainings to patients, but age correlated with a number of other potential predictors, so it is important to disentangle these variables in future studies.

Conclusion

In conclusion, CBM seems to be a promising add-on in the treatment of addiction as well as other psychopathology. Older people seem to profit most from the training, but age is correlated with various other variables. The next research steps should be to clarify the underlying mechanisms of CBM compared with other training interventions, and to address issues of implementation. Finally, it is important for both theoretical and clinical reasons to investigate the question of moderation: which training works best for whom?

Article summary with On the scientific status of cognitive appraisal models of anxiety disorder by McNally - 2001

It is often claimed that psychologists use different paradigms to understand abnormal behaviour, and that each makes a unique contribution. For the cognitive paradigm, however, this is not the case. Two seemingly irreconcilable approaches can be distinguished:

  • Emotional disorders result from problematic appraisals and beliefs, measured through introspective self-report.

  • Emotional disorders result from abnormalities in information processing, whose influence is measured through overt behavioural expressions (e.g. reaction times).

The aim of this article is to examine the appraisal approach and the criticism it has received. The author argues for methodological pluralism.

When people refer to ‘the cognitive approach to emotional disorders’, they usually mean Beck’s theory. Beck was originally trained in psychoanalysis, but soon saw its limitations and formulated a theory and therapy for treating depression. Others followed with similar theories of anxiety, OCD, and PTSD. The central theme of these theories is that beliefs matter. Beliefs form the causal basis of the misappraisal of bodily sensations, thoughts, and the like involved in an anxiety disorder. The relevant beliefs and appraisals are specified by the theory. One example is Clark’s model of panic attacks: a person has a panic attack when he or she misinterprets bodily sensations as the harbinger of catastrophe. Another example is Salkovskis’s model of OCD: intrusive thoughts only become obsessions when the person appraises them in a particular way. The appraisal approach is thus clearly a rational approach. Moreover, it assumes a normalisation of psychopathology: symptoms are the expected consequences of problematic beliefs, which puts even neurobiological ‘abnormalities’ in a different light.

Despite the success of this theory and the therapy based on it, there has also been much criticism. Lang considers the explanatory value of the theory limited. In addition, MacLeod emphasised that people are generally poor at describing the factors that influence their behaviour.

The argument is that people are often unaware of the causes of their behaviour, which makes introspective evaluations of little value. As the theory describes, the reasoning of patients with delusions, for example, can be judged against logical standards; when, then, do we call something pathological? Beliefs may be rationally explicable, but that does not make the event itself ‘normal’ (think of intrusive thoughts). The criticism can be divided mainly into conceptual and methodological objections.

Lang (1988) raised a mainly conceptual objection: appealing to the catastrophic interpretation as the cause of panic still begs the question of the underlying cause of that interpretation. MacLeod (1993) raised a mainly methodological objection: unobservable mechanisms may well explain something, but they must then be measured by objective indexes, and a self-report is not one.

The appraisal approach has not suffered from these criticisms. Therapeutically in particular, it has been considerably more fruitful than the information-processing approach, although the fact that the therapy is widely used and effective does not in itself mean the theory is correct. So-called folk psychology also keeps the appraisal approach ‘alive’. Folk psychology is the everyday theory we use to explain and predict other people’s behaviour. Important with regard to the reliability of introspective reports is Tulving’s (1985) theory: progress in our knowledge of memory means that we must capture and measure different kinds of awareness, as a component of experience or as a dependent variable. People use two kinds of memory judgement in recognition: knowing versus remembering. Although this distinction is based on introspective phenomena, it relates to important memory phenomena. In some cases self-reports do track the underlying cognitive mechanisms. Moreover, many aspects of psychopathology have no overt behavioural expression (such as obsessions in OCD).

In conclusion, methodological pluralism is justified. Some self-reports are misleading, but at other times a self-report may be exactly what is needed, depending on the question. A self-report is essential if one wants to know the patient’s beliefs. In general, people can answer ‘what questions’, but often not ‘how questions’.

Article summary with Increasing the efficacy of cue exposure treatment in preventing relapse of addictive behaviour by Havermans & Jansen - 2003

Introduction

The treatment in which a drug addict is repeatedly exposed to stimuli associated with addictive behaviour is known as Cue Exposure with Response Prevention (CERP). The responses these cues evoke are generally considered to be conditioned drug responses. These can be psychophysiological (changes in heart rate), behavioural (drug-seeking behaviour), or subjective (craving), and they increase the chance that an addict will take drugs. The treatment leads to the gradual extinction of the conditioned drug response. Unfortunately, no research has yet found a substantial effect on relapse from extinguishing cue reactivity; apparently the effect (extinction of cue reactivity) does not generalize beyond the treatment setting. This can be explained by the fact that conditioned responses can recover after extinction: extinction is not the ‘unlearning’ of the original association, and its effects are not permanent.

This article describes a number of ways in which CERP can be improved so that it’s more effective in combating relapse. First it describes how contemporary learning theory defines extinction and how it explains the recovery of conditioned responding. Second, different methods to prevent the recovery of extinguished cue reactivity are discussed. Then suggestions for future research are addressed.

Theoretical Notes

Pavlov already showed that extinction is not permanent: extinguished responses can recover, a phenomenon he called spontaneous recovery. In addition, the renewal effect shows that extinction is not the same as unlearning an association. The renewal effect means that an extinguished conditioned response can return when the conditioned stimulus is presented in an environmental context different from the context in which extinction took place. Bouton’s (1994) account reads as follows: after extinction, the conditioned stimulus has acquired an ambiguous meaning. It predicts both the presentation of the unconditioned stimulus (from conditioning) and its absence (from extinction); the conditioned stimulus has an excitatory meaning during conditioning and an inhibitory meaning during extinction. Which meaning is retrieved depends on the context. Spontaneous recovery can be understood the same way: with the passage of time, the temporal context changes, so the conditioned stimulus is again interpreted according to its original, excitatory meaning.
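
The acquisition–extinction dynamic discussed here is often formalized with the Rescorla–Wagner learning rule, ΔV = α(λ − V). The sketch below is purely illustrative: the parameter values are arbitrary, and the rule has no context term, so extinction in this model literally erases associative strength — exactly the ‘unlearning’ picture that the renewal and spontaneous-recovery data argue against.

```python
# Rescorla-Wagner sketch of acquisition and extinction.
# Illustrative only: alpha and the trial counts are arbitrary choices.

def rescorla_wagner(trials, v=0.0, alpha=0.3):
    """Update associative strength V across trials (True = US present)."""
    history = []
    for us_present in trials:
        lam = 1.0 if us_present else 0.0   # asymptote of learning
        v += alpha * (lam - v)             # delta-V = alpha * (lambda - V)
        history.append(v)
    return history

acq = rescorla_wagner([True] * 10)              # 10 CS-US pairings
ext = rescorla_wagner([False] * 10, v=acq[-1])  # 10 CS-alone extinction trials
print(round(acq[-1], 2), round(ext[-1], 2))     # V rises toward 1, then decays toward 0
```

In Bouton’s account, by contrast, the extinction trials would add a context-gated inhibitory association rather than drive V itself back to zero, which is why renewal and spontaneous recovery remain possible.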

The treatment of drug addicts is difficult because they use drugs in many different places and situations. So there are too many practical problems to offer a ‘standard’ CERP. Offering CERP in different environments would increase the chance of generalization. However, offering CERP in multiple environmental contexts could cause a delay of extinction. Moreover, it’s unclear how many different contexts are needed. The perfect CERP would be a treatment that considers and controls for the probability of renewal effects and spontaneous recovery, and takes into account the above problems.

A promising option is to add so-called retrieval cues to the CERP treatment. Retrieval cues are salient features of the extinction environment that help retrieve the inhibitory meaning of the conditioned stimulus outside that environment. If a retrieval cue is presented outside the original extinction context, it reduces the chance of renewal. An example of a retrieval cue would be a reminder card.

Conclusion

The study of learning and motivation is a dynamic research area, and contemporary learning theory therefore continues to be an important resource for the development and implementation of behavioural therapy. Based on principles derived from contemporary learning theory, CERP incorporating retrieval cues is a promising adjustment of standard cue exposure treatment: it does not require extending the length of treatment and may control for the previously mentioned limitations.

If cue reactivity is extinguished and the recovery of conditioned drug responding is prevented by cue exposure with retrieval cues, the prediction is that relapse of addictive behaviour can be prevented more successfully.

Article summary with Classical conditioning and the acquisition of human fears and phobias: A review and synthesis of the literature by Davey - 1992

Introduction

Current conditioning models are a source of information about learning and performance in various ways. Firstly, it’s now possible to use these models to describe in a reasonably accurate manner the conditions under which associative learning occurs. Secondly, there are important non-associative processes that influence the strength of a conditioned response. Thirdly, the discovery of these non-associative processes has consequences for treatment. There are important procedural and dynamic differences between animal and human conditioning. That’s why it’s important to have a model that, with knowledge about animal conditioning, forms a framework for human conditioning.

Conditioning Processes in Humans

An important development in animal conditioning is that more and more inferential techniques are being used, so that there is now also information about aspects that we can’t see. The two most important developments are the types of associations that are formed during different types of classical conditioning, and the nature of the cognitive representations that influence the conditioned response (CR). Post-conditioning stimulus revaluation is an illustration of this. This procedure consists of three phases:

  1. Animals are given pairings of a conditioned stimulus (CS) and unconditioned stimulus (UCS) together until a CR is formed.
  2. Then they receive the UCS alone a number of times – revaluation.
  3. They receive test presentations from the CS.

The logic behind this is that if the CR is mediated by an association between the CS and the UCS, then revaluation of the UCS will also affect the CR. In contrast, if the CR is mediated by a direct stimulus-response (reflex-like) association, the CR will not be affected by revaluation, because the UCS representation plays no mediating role. It appears that animals generally learn associations between CS and UCS. This has various implications for our conception of Pavlovian responding. First, it means that the CS activates an internal representation of the UCS and that this representation mediates the CR. Second, the strength and nature of the CR depend on how the UCS representation is evaluated. Third, the strength of the CR can be modulated by procedures that lead to revaluation of the UCS representation.

It has been shown in two ways that people also learn the association between CS and UCS. First, research shows that people only show a differential CR if they can verbally report the contingency between CS and UCS. Second, post-conditioning revaluation of the UCS appears to influence the strength of the CR. These revaluation processes can be very important for modulating the strength of the CR in humans. There are several ways in which revaluation can be achieved: through direct experience with the UCS, through socially or verbally transmitted information about the UCS, or through a person’s own response to the CS or UCS. Revaluation can influence the strength of the CR independently of further experience with the CS-UCS contingency.

Currently, most animal conditioning models are based on contingency rather than contiguity. This means that they emphasize that contiguity between CS and UCS is not sufficient to cause conditioning; more important is the predictive significance of the CS in signalling the UCS. To judge the relationship or covariation between events, both situational information and prior expectations about the covariation are used. A so-called covariation bias can distort the perception of covariation. The important point here is that prior experience can influence the strength and course of conditioning in people.
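
The contingency idea can be made concrete with the ΔP statistic, P(US|CS) − P(US|no CS), a common way of quantifying predictive value. A minimal sketch, with trial counts invented purely for illustration:

```python
# Delta-P sketch of CS-US contingency (hypothetical trial counts).
# delta_p = P(US | CS present) - P(US | CS absent).

def delta_p(us_and_cs, no_us_and_cs, us_no_cs, no_us_no_cs):
    p_us_given_cs = us_and_cs / (us_and_cs + no_us_and_cs)
    p_us_given_no_cs = us_no_cs / (us_no_cs + no_us_no_cs)
    return p_us_given_cs - p_us_given_no_cs

# Pairings alone are not enough: here CS and US co-occur often,
# but the US is just as likely without the CS, so contingency is zero.
print(delta_p(8, 2, 8, 2))             # 0.0 -> CS has no predictive value
print(round(delta_p(8, 2, 1, 9), 1))   # 0.7 -> CS predicts the US
```

A covariation bias, on this view, amounts to misestimating one of these conditional probabilities, for example by over-counting remembered CS-US co-occurrences.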

Latent inhibition – when a CS is repeatedly presented alone before conditioning, it is more difficult to form an association between the CS and UCS than if the CS had not been presented alone. This process can also be shown in people.

Another phenomenon is known as ‘blocking’: if the UCS is already predicted by one CS (e.g. A), pairing a compound of A and a new stimulus (e.g. B) with the UCS results in little or no conditioning to B. Blocking is thus an example of CS-UCS pairings that produce no CR. Demonstrations of blocking in people have not always proved reliable.

Another phenomenon is known as sensory preconditioning: if an animal is presented with two neutral stimuli together (CS1 and CS2), no behavioural changes occur. However, if CS2 is then paired with a UCS, presentation of CS1 will also generate the CR associated with that UCS. This phenomenon has also been found in humans. It shows that associative learning can occur without the presence of a differential CR and without an aversive UCS.

Higher-order conditioning means that once a CS has an association with the UCS and can evoke a CR, that CS can itself be used to condition other potential CSs. Experiments in humans show that a CS2 can become associated with the UCS representation activated by CS1, because neither the response to CS1 nor CS1 itself can make the memory of the aversive UCS disappear.

The foregoing information has made it clear that human conditioning has the most important associative characteristics of animal conditioning. In addition, it’s important to reflect the contingent learning of people in terms of expectations. In other words, people assess the relationship between the CS and the UCS by looking at the relevant information and forming an expectation based on this information. People differ from animals in the sophistication and diversity of information sources. In addition, it’s important to include certain human qualities with regard to nonassociative factors. For example, people can handle problems in a complex way and have different coping styles. In the model illustrated in this article, the ability of the CS to activate a cognitive representation of the UCS depends on the variety of factors that determine what the person expects. The strength of the CR is modulated by processes that influence the assessment of the person. This model differs from traditional conditioning in two ways.

  1. The association between CS and UCS is influenced by factors other than the perceived contingency.
  2. The strength of the CR is determined by non-associative factors that influence the evaluation of the UCS.

Traditional Criticisms of Conditioning Accounts of Phobia

Now that a model has been established and discussed, it is possible to review a number of criticisms. First, one apparent difficulty for a conditioning account of phobias was that it did not seem able to predict the conditions under which someone would not develop a phobia, since not all individuals who experience pain or trauma paired with a situation develop a phobia of that situation. Current conditioning theory has several explanations for this: latent inhibition and the dependence of the CR on the evaluation of the UCS. Current theory does not expect a phobia to arise if the traumatic UCS is devalued immediately after the experience.

Secondly, many anxious people can’t remember trauma. The current model sees the acquisition of an association between CS and UCS and the modulation of a negative UCS as two relatively independent processes.

Third, it was often found that CS-alone presentations only increased anxiety, even though the UCS was not present. This can be explained by the influence of individual evaluation: anxious people appear to relive the trauma in their heads and to focus on the traumatic components of an event, which could be sufficient to prevent extinction.

Fourth, there appears to be a disproportionate distribution of fears in the clinical and non-clinical population. Fears attach to certain situations and events (snakes, spiders) and not to others (cars, electricity), even though the latter are often more dangerous. From an evolutionary point of view, an explanation could be that evolution has left us with a predisposition to fear stimuli that once threatened our ancestors. This claim has received much criticism, partly because of its lack of predictive power. A weaker version states that we possess selective associations, meaning that certain stimuli are more easily associated with certain consequences. On this account, such ‘preparedness’ is not determined evolutionarily, but by prior beliefs about the relationship between stimuli and outcomes. This can also be found in the current model, where expectation plays this role.

The Constituency of Conditioning Models

The purpose of classical conditioning models in humans is broader than in animals. We are not so much interested in the factors that determine associative strength as in understanding all the factors that determine the strength and persistence of a CR. That is why a model is needed that focuses more on predicting performance than on the underlying processes. Rachman outlined a ‘three pathways’ model of acquisition:

  1. Normal conditioning through direct experience with the trauma
  2. Transmission of information
  3. Observational learning

There’s evidence that part of the acquisition of phobias goes through the ‘other’ routes, and not through direct experience of a traumatic event. This form of conditioning seems to be more important if there was little prior experience with the stimuli, for example in childhood. Expectations can have two effects. First, they can have an effect on the assessment of covariation and strength of the CR. Secondly, they can develop a differential CR without direct experience with the UCS.

Revaluation can function in a number of ways in fears and phobias. An example is denial, which appears to have a positive effect in the short term. In addition, many people use some form of devaluation as a coping strategy: they make the threat more neutral, use positive comparisons, selectively ignore elements, or make the event less important. This coping strategy reduces fear. Different events in a person’s life can also influence the evaluation of a trauma. An example of a sudden shift in anxiety is spontaneous remission, in which the symptoms suddenly disappear; this seems to happen more often when there are positive changes in one’s life. The revaluation of trauma is therefore important in the modulation of anxiety, and it is a determinant of the strength of a CR.

The Scope of Conditioning Models of Phobias

For which fears is such a conditioning model relevant? Not only for simple phobias acquired through conditioning, but also for phobias that arise through, for example, observation. The relevance of the model for mild non-clinical fears is not entirely clear; a major obstacle is the lack of empirical evidence. It must be established to what extent associative phenomena and the effects of revaluation can be identified in terms of ontogeny, remission, and maintenance. Moreover, it is suggested that conditioning is involved in panic attacks, with the ‘catastrophic interpretation’ of bodily sensations as an important component; here too the model suggests a combination of associative and cognitive factors.

Implications for Treatment

The most important aspects of the current conditioning approach to anxiety are the consequences for therapy:

  • With latent inhibition and blocking in mind, regular nontraumatic exposure to situations as early as possible seems very useful. This is a preventative method that is already widely used (e.g. in dentistry).
  • Acquiring fears and phobias is influenced by individual coping strategies, which can be influenced by coping skills and positive thinking.
  • The revaluation of stimuli is important, for example by convincing the patient that the stimulus is no longer a threat. Evaluation and expectation don’t seem to depend solely on verbal information, which makes therapy more difficult.
  • Self-observation of a person’s own response to a feared stimulus can prompt a change in the evaluation of the UCS; exposure can help with this.
Article summary with Expectancy-learning and evaluative learning in human classical conditioning: Affective priming as an indirect and unobtrusive measure of conditioned stimulus valence by Hermans et al. - 2002

Contemporary cognitive models of classical conditioning provide a framework for understanding people’s fears and phobias. According to these models, classical conditioning is the acquisition of associations between representations of stimuli or events. For an explanation of classical conditioning, see a number of previously summarized articles.

Recently a distinction has been drawn between two forms of learning in classical conditioning: expectancy learning and merely referential learning. Expectancy learning arises when presentation of the CS activates the expectation that the US will actually occur. Referential learning arises when presentation of the CS activates the representation of the US without this expectation also arising: the CS makes someone think of the US, without expecting the US.

Evaluative conditioning (EC) is a type of referential learning: the observation that pairing neutral stimuli with liked or disliked stimuli changes the valence of the neutral stimuli in a positive or negative direction. Evaluative learning resembles expectancy learning in that it is sensitive to US revaluation, so the shift in the valence of the CS seems to be based on an association between the CS and US representations. Research into evaluative learning and expectancy learning differs in two areas:

  1. The type of stimuli: evaluative conditioning studies have often used rather unobtrusive stimuli such as flavours or pictures of human faces, paintings, or statues, rated as positive or negative by the individual participant, whereas expectancy-learning studies have utilized biologically significant stimuli such as food or shock, which are unconditioned stimuli in the strict sense: they elicit a similar, innate positive or negative response in all subjects.
  2. The response systems are different: evaluative investigations focus primarily on verbal ratings of CS valence, whereas expectancy investigations focus primarily on nonverbal motor and autonomic responses such as the skin conductance response.

In addition to verbal assessments, it is important to investigate nonverbal and physiological responses. One good index is performance on affective priming tasks. Positive or negative target stimuli are presented, which the participant must evaluate as quickly as possible as positive or negative. Each target is preceded by a prime stimulus, which is positive, negative, or neutral and must be ignored. The valence of the prime appears to influence the speed with which the targets are evaluated, an effect based on automatic processing of the prime.
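
A priming effect of this kind is typically scored as the mean reaction-time difference between incongruent and congruent prime-target pairs. A minimal sketch, with reaction times and labels invented purely for illustration (these are not data from the study):

```python
# Affective priming sketch: evaluations are faster when prime and target
# share valence. The RTs below are invented for illustration.
from statistics import mean

rts_ms = {
    # if the CS+ has acquired negative valence, negative targets are congruent
    ("cs_plus", "negative_target"): [520, 540, 530],
    ("cs_plus", "positive_target"): [590, 600, 580],  # incongruent -> slower
}

congruent = mean(rts_ms[("cs_plus", "negative_target")])
incongruent = mean(rts_ms[("cs_plus", "positive_target")])
priming_effect = incongruent - congruent
print(priming_effect)  # a positive effect suggests the CS+ carries negative valence
```

The appeal of the measure is that it indexes CS valence without asking the participant anything about the CS directly.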

The present study examines whether evidence can be found for both forms of learning within the same conditioning procedure. The procedure includes images of human faces as CS+ and CS- and an electrocutaneous stimulus as US. Valence was investigated through the affective priming procedure and verbal ratings; expectancy learning was examined through ratings of US expectancy. Arousal and fear ratings were also collected for exploratory reasons.

In addition, the study aims to analyze whether evaluative learning as demonstrated in aversive conditioning is comparable to evaluative learning with non-aversive USs. Besides the CS+ and the CS-, a CS-positive and a CS-negative were therefore included, paired with positive and negative adjectives, respectively. Finally, the study looks at the extent to which specific imagery instructions can enhance learning; previous research has shown that mental imagery can potentiate the effects of acquisition.

Method

32 students (11 men, 21 women) participated.

Experiment consisted of three phases:

  1. Stimulus selection: participants assessed 60 photos of faces. The intensity of the US was also measured in this phase by means of electrodes.
  2. Acquisition: CS+, CS-, CS-positive, and CS-negative were shown to the participants. A corresponding sound was played during presentation of CS-positive and -negative.
  3. Affective priming: the affective priming task was offered.

Results & Discussion

After the acquisition, the participants indicated that the electrocutaneous US was intense and unpleasant. During acquisition, the CS+ had become a valid predictor of the US: participants were aware of the contingency between CS+ and US (and between CS- and no US), so expectancy learning had emerged. In addition, there were shifts in the evaluation of the CS+ and the CS-. After acquisition, participants rated the CS paired with positive adjectives positively and the CS paired with negative adjectives negatively. No expectancy learning was involved in the conditioning of the CS-positive and the CS-negative. Moreover, no significant difference was found in the degree of evaluative learning between the two types of conditioning procedures.

To increase the level of learning, half of the participants were asked to imagine something about the relationship between the CS and the specific US. The two groups did not differ in the degree to which they liked or disliked the different types of US, and there was no significant effect on the level of evaluative learning. The degree of arousal was influenced by the conditioning procedures: the CS+ was found to be more arousing than the CS-, and the CS+ also generated more fear than the CS-. The CS-negative generated more fear than the CS-positive.

In conclusion, the experience of contingent presentations of a neutral face (CS+) and an aversive electrical stimulus (US) changed the experience of the CS+ in two ways:

  • The CS+ became a predictor of the US.
  • The CS+ itself became more negative.

Thus, evaluative learning and expectancy learning can occur together. Moreover, the results of the affective priming procedure show that affective priming can be obtained with aversively conditioned stimuli. An alternative to the affective priming procedure is modulation of the acoustic startle reflex: a defensive reflex to an unexpected loud sound, which varies with the valence one assigns to a picture. Startle modulation can be a good way to measure valence after acquisition, but it cannot measure the effects of extinction: a substantial startle reflex is only found with pictures that elicit considerable arousal, and such responses usually disappear after extinction.

In addition to the finding that expectation learning and evaluative learning can occur together, a second conclusion can be drawn from the results: the amount of evaluative learning resulting from an aversive conditioning procedure does not interact with the type of conditioning preparation. Evaluative learning can therefore also be generalized to conditioning procedures with more biologically relevant US stimuli. This suggests that clinical fears based on an experienced contingency between a neutral stimulus and a negative stimulus involve not only the CS becoming a predictor of a negative experience, but also the CS itself being evaluated more negatively. A combination of exposure and counterconditioning would therefore be a good way to prevent relapse in patients.

A number of things could be interesting for follow-up research: first, whether the conditioning effects found also occur with other types of stimuli; second, whether the same evaluative shifts are found if the CS is masked during acquisition; and third, the precise nature of the conditioned response that remains after extinction.

Article summary with Renewal and reinstatement of fear: Evidence from human conditioning research by Vansteenwegen a.o. - 2006

Reinstatement of fear is defined as the reappearance of fear that has undergone partial or complete extinction. This return may be influenced by factors present before, during, and after treatment. Episodes of returning anxiety often coincide with stressful events or major changes in a person's life. This summarized chapter focuses on exposure and the return of anxiety, making use of theory and research on extinction in classical conditioning. In addition, human conditioning research is discussed. Finally, two posttreatment manipulations are examined: renewal (changing the context after extinction) and reinstatement (the presentation of unpredicted unconditioned stimuli (USs) after extinction).

Exposure and Extinction

The behavioural treatment of anxiety disorders often includes systematic and repeated exposure to the stimulus that evokes anxiety. Exposure corresponds to extinction in some respects. During extinction, the conditioned stimulus (CS) is repeatedly presented without the US. During exposure, the patient is repeatedly exposed to the stimulus that generates anxiety. However, extinction is not the same as unlearning; rather, it involves an additional learning experience. Two associations exist after extinction: a CS-US association (excitatory link) and a CS-noUS association (inhibitory link). Four postextinction recovery phenomena show that extinction is not the unlearning of an association: spontaneous recovery (return after the mere passage of time), disinhibition (return after the presentation of a novel stimulus), reinstatement (return after re-exposure to the US), and renewal (return after a context change). The best explanation is that exposure treatment temporarily or contextually masks the fear-arousing association but does not erase it.

Underlying Mechanisms of Extinction

The majority of knowledge about the underlying mechanisms of extinction comes from animal studies, though there are differences between conditioning research in humans and animals. In animal conditioning, acquisition and extinction are under full experimental control; in human studies, there is no control over the acquisition of fear or anxiety. A solution to this lack of control is a laboratory experiment in which fear is first established and then eliminated. The strength of conditioning studies is that they distill theoretical assumptions to their essence and allow the phenomenon to be studied under strictly controlled circumstances, thereby permitting better analysis of the underlying mechanisms. Of the different human conditioning paradigms, the human fear-conditioning paradigm comes closest to a real-life fear situation.

Human Fear Conditioning

In the conditioning paradigm, neutral visual stimuli are used as CSs and either an electrocutaneous stimulus or a loud aversive noise is used as the US. One of the two visual stimuli is systematically followed by the US (the CS+) and the other is not (the CS-). During extinction, both stimuli are presented without the US. Associative learning effects are measured as a decrease or increase in differential responses to the CS+ and CS-. Because emotions can only be measured through behavioural, verbal, or physiological changes, indexes for these have also been included in the study. The physiological indexes are skin conductance and startle modulation. The behavioural indexes are affective priming and the secondary reaction time task. Affective priming uses positive and negative words for the CS+ or CS- to indirectly measure valence. In the secondary reaction time task, a tone is emitted during the presentation of the CS+ and CS-, and the participant must press a button upon hearing it. Response times are expected to be slower with the CS+ than with the CS-. On the basis of this evidence, it is concluded that the CS+ has become a fearful stimulus after acquisition, as defined by the properties of negative valence and high arousal.

The data related to extinction from these experiments are: conditioned verbal expectancy of the US, electrodermal responding, startle responses in the form of eyeblinks, and the response times from the secondary reaction time task. In the paradigm used here, it is found that the CS+ becomes a fearful stimulus through acquisition, and that at the end of a standard extinction procedure the CS+ has lost at least one essential aspect of fear, namely US expectancy. However, the valence of the CS+ seems more difficult to change.

Renewal of Fear

Renewal is defined as the return of extinguished conditioned responses caused by changes in the contextual cues that were present during extinction. The most commonly observed form is ABA (acquisition in context A, extinction in a different context B, and testing back in the original context A). ABC and AAB renewal are also found.

Evidence from the clinical literature also emphasizes the importance of the environment. Acquisition easily generalizes to a new context, but extinction does not. Within the paradigm used, evidence has been found that extinction in a context other than that in which the acquisition took place or with a stimulus other than the acquired stimulus can cause the return of fear. The aforementioned difference in generalization between acquisition and extinction is not found in an AAB design, whereby the acquisition and extinction took place in the same context. Renewal in AAB appears to be more difficult to obtain than in an ABA design. That AAB effects are weaker than ABA effects appears to be useful for therapy: it can help to imitate the original acquisition situation during therapy (if possible).

Mechanisms of Renewal and Clinical Implications

Bouton and his colleagues developed a contextual theory about extinction: the context of extinction has a modulating role and helps to eliminate ambiguity between old knowledge (CS-US association and acquisition) and the new knowledge (CS-no US association or extinction). In this theory, renewal is a consequence of leaving the extinction context. Two alternatives are suggested to explain the context specificity of extinction:

  • Forming a direct inhibiting association between the context of extinction and the US. The patient learns in practice that the context of therapy is safe.
  • Perceiving the fearful object as a partially different stimulus in the context of exposure than in the context of acquisition. In the literature on animal conditioning, this is called generalization decrement.

Prevention of Renewal

In the literature on animal conditioning, two strategies are proposed to stimulate transfer to other contexts and prevent renewal. The first is the use of retrieval cues: stimuli that can be taken from the context of extinction to the test context and retrieve information from the extinction-exposure episode. The second method is extinction in multiple contexts. In humans, the effectiveness of retrieval cues has been demonstrated by means of a differential human contingent learning experiment. It was also found that manipulating contexts effectively reduces the renewal of fear of spiders. The method that uses retrieval cues does not change what has been learned during exposure: the association still depends on the context. The method that uses multiple contexts, on the other hand, effectively increases the number of contexts that may have an impact on what has been learned. Two mechanisms are important to explain renewal: the modulating mechanisms (context that modulates the CS-US relationship) and the direct inhibiting association between the context and the US.

Reinstatement of Fear

Reinstatement is defined as the return of extinguished conditioned responses caused by one or more US-only presentations after extinction. An experiment to demonstrate this first establishes a conditioned response through the contingent presentation of a CS and a US. Subsequently, the CS is repeatedly presented alone, producing extinction. The US is then presented alone, without the CS.

Reinstatement appears to be a robust phenomenon, but research shows that conditioned contextual stimuli are necessary to obtain it. To account for this context dependency, the context-CS summation view was formulated: during the reinstatement phase, the reinstatement context becomes excitatory, and this contextual conditioning is assumed to summate with the residual associative strength of the CS after extinction.
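The context-CS summation view amounts to a simple additive account. The sketch below is a toy illustration only; the function name and all numeric values are assumptions for exposition, not parameters from the chapter.

```python
# Toy sketch of the context-CS summation view of reinstatement.
# All values are illustrative assumptions, not estimates from the chapter.

def conditioned_response(cs_strength, context_strength):
    """Responding is assumed to be the sum of the residual associative
    strength of the CS and the excitatory strength of the context."""
    return cs_strength + context_strength

residual_cs = 0.2         # small associative residue left after extinction
neutral_context = 0.0     # context before the US-only presentations
excited_context = 0.5     # context after unsignalled US presentations

print(conditioned_response(residual_cs, neutral_context))   # little fear
print(conditioned_response(residual_cs, excited_context))   # fear returns
```

On this account, fear "returns" not because the CS-US association is relearned, but because the US-only presentations raise the excitatory strength of the context, which then summates with what extinction left behind.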

Another explanation is that reinstatement is seen as a special event caused by contextual cues. Leaving the context of extinction makes it more difficult to retrieve the CS-noUS association.

Mechanisms of Reinstatement and Clinical Implications

The fear-conditioning paradigm allows for the manipulation of several variables that may elucidate the mechanisms driving reinstatement. For example, if reinstatement can also occur with a US that differs from the original US, there are many more circumstances in which fear can re-emerge. In addition, to the extent that reinstatement can be considered a special case of renewal, methods available to reduce renewal may also be applicable to attenuate reinstatement.

For human conditioning, it remains unclear whether reinstatement requires the 'new' US to share properties with the original one. As described earlier, the valence of a stimulus does not seem to disappear through extinction. The authors of this chapter suggest that valence may explain individual differences in the strength of the reinstatement effect.

Article summary with A cognitive-motivational analysis of anxiety by Mogg & Bradley - 1998

Introduction

An important consideration in the cognitive analysis of emotions is their evolutionary origin. Regarding anxiety, there is a system that ensures attention is paid to threatening environmental and interoceptive stimuli that are relevant to the person's motivational state. This allows a quick and effective response to threatening stimuli. In the case of anxiety, a bias can then arise in selective attention to threatening information.

A feature of recent cognitive theories is the emphasis on the role of attentional processes in the etiology and maintenance of anxiety. Comments can be made on each of these theories, which is why the specific mechanisms responsible for an attentional bias in anxiety are subsequently explained from a cognitive-motivational point of view.

Cognitive Theories

According to recent cognitive theories, biases in information processing play an important role in the etiology and maintenance of emotional disorders, such as generalized anxiety disorder (GAD) and major depressive disorder (MDD). Beck developed a schema model in which disorder-relevant schemas have become dysfunctional, and Bower proposed a semantic network theory of emotion in which each emotion is represented by a node in an associative memory network.

Both theories say that anxiety and depression are associated with an emotion-related bias in all aspects of information processing. However, this is not entirely true, because anxiety is mainly associated with a bias in selective attention and depression with a bias in selective memory.

Williams (1988) then developed a revised cognitive theory about anxiety and depression. Some characteristics of this theory are:

  • Different emotional disorders are associated with different patterns of cognitive bias.
  • People who have a permanent tendency to show preattentive, automatic vigilance for threat are more susceptible to the development of anxiety disorders when under stress.
  • Trait anxiety influences the direction of preattentive and attentional biases to threat stimuli. People with high trait anxiety orient their attention towards the threat and people with low trait anxiety direct their attention away from it. This is the interaction hypothesis.
  • Cognitive behavioural therapy is effective because these biases are corrected.

Williams then says that two mechanisms are responsible for the preattentive and attentional bias towards threat: the Affective Decision Mechanism (ADM) and the Resource Allocation Mechanism (RAM). The ADM assesses the degree of threat of stimuli. The output of this mechanism depends on the stimulus, but also on the person's current degree of anxiety (state anxiety). The output serves as input for the RAM, which determines the allocation of processing resources.

The effect of RAM depends on the degree of trait anxiety of the person. As the output of ADM increases, the difference between people with high and low trait anxiety (interaction hypothesis) becomes clear.

Later, the Williams model was adapted using connectionist terms (1997). However, the interaction hypothesis remains an important key point in the model. Various other theories are an addition to this model. The main focus is on the assumption that preattentive processes play a role in anxiety. These processes are important in the evaluation of stimuli and in giving direction to the focus of selective attention and the response that is given as a result. These processes are said to be vulnerable to the development of clinical anxiety.

However, a number of comments can be made about these cognitive theories:

  • The interaction hypothesis says that people with low trait anxiety turn away from a threatening stimulus. However, this is maladaptive for genuinely threatening stimuli and would mean that the threat detection system is not functional.
  • Several studies have found a bias in preattentive processes in anxiety disorders, but not in depression. This is odd, because depressed people also have a higher level of anxiety.
  • The theories make little use of recent neurobiological systems that play a role in anxiety.

Below are two biological formulations of anxiety:

  1. Gray says that vulnerability to anxiety is associated with individual differences in Behavioral Inhibition System (BIS) activity. This system compares current with expected stimuli and takes action if they do not match or if there are aversive stimuli. The BIS inhibits current behavior, which increases arousal and attention to the stimuli. Anxious people have an overactive BIS.
  2. LeDoux is known for his belief that the amygdala plays a major role in emotions and that it is activated by a threatening stimulus. Attention processes play a lesser role in his model.

Cognitive-motivational point of view

According to this position, there are two motivation-related systems that, in combination with each other, play a key role in mediating fear. These are the Valence Evaluation System and the Goal Engagement System. They can each be placed on an axis in a two-dimensional network.

A motivational analysis of these factors says that a bias in preattentive processes and initial orienting towards emotional stimuli depends on the combined functioning of valence evaluation and goal engagement. According to the cognitive-motivational point of view, the Valence Evaluation System assesses the threat value of a stimulus. This depends on the stimulus itself, but also on the context, the degree of anxiety at that moment (state anxiety), previous experiences, and the degree of arousal.

In addition, the Valence Evaluation System is more sensitive in people with high trait anxiety. Trait anxiety is then seen as an increased output of the Valence Evaluation System, reflecting a bias in the appraisal of threats. The output of the Valence Evaluation System serves as input for the Goal Engagement System, which is responsible for mediating preattentive and attentional responses. With low threat output, nothing is done; with high threat output, attention is immediately directed at the threat and other processes are interrupted.

According to this view, there is no linear relationship between the appraisal of threatening stimuli and attentional bias. When there is no threat, no attention is paid to the stimulus. With a mild threat, attention shifts away from the stimulus. With a medium to high threat, increasing attention is given to the stimulus, eventually plateauing at a high level.
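This nonlinear relation can be sketched as a piecewise function. The thresholds and output values below are hypothetical illustrations, not parameters given by Mogg and Bradley; only the qualitative shape (ignore, avoid, then vigilance with a ceiling) comes from the text.

```python
# Hypothetical sketch of the nonlinear relation between appraised threat
# (output of the Valence Evaluation System, scaled 0-1) and attention
# allocation. Thresholds and values are illustrative assumptions only.

def attention_allocation(threat_value):
    """Negative = attention directed away from the stimulus,
    positive = attention directed towards it."""
    if threat_value <= 0.1:                   # no threat: ignored
        return 0.0
    elif threat_value <= 0.4:                 # mild threat: shift away
        return -0.5
    else:                                     # medium/high threat:
        return min(1.0, threat_value + 0.3)   # vigilance with a ceiling

print(attention_allocation(0.05))   # no attention
print(attention_allocation(0.30))   # attention directed away
print(attention_allocation(0.95))   # vigilance, capped at the ceiling
```

Note how this differs from the interaction hypothesis: even a "low trait anxiety" system eventually orients towards the stimulus once appraised threat is high enough.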

Some advantages of this system are:

The cognitive-motivational view says that an attentional bias or pre-attentive bias does not play a causal role in the etiology of an anxiety disorder. Such a bias can also be found in people with low trait anxiety when the stimuli have a high threat.

The presence of such a bias for mildly threatening stimuli may be a sign of a sensitivity to anxiety, but this may not be a determining factor. The primary factor that determines this sensitivity, according to the cognitive motivational point of view, is a bias in the Valence Evaluation System. The attention processes may possibly be important in the maintenance of anxiety.

Research into attentional bias in anxiety

Studies investigating a bias in selective attention to emotional stimuli use the (emotional) Stroop test, the probe detection task, and the colour perception task. The latter is a variant of the probe detection task in which a word pair is presented, one word neutral and the other emotional; two coloured fields then appear simultaneously in place of the words, but anxious people perceive the field that replaces the emotional word as appearing earlier. The Stroop task bases its conclusions on interference effects, but these are complex and sometimes difficult to interpret because of confounds and "strategic override". The latter emerges when phobic individuals are tested in the vicinity of the feared stimulus; at that moment, the interference effect disappears. The probe detection task is less sensitive to this.

The problem with all these tasks is that they only expose part of the attentional bias, because the tasks depend on the time at which the stimulus is presented. All of these tests provided evidence for an attentional bias, but were performed with stimuli that the subject could consciously attend to.

Studies that investigated whether there is also a bias in selective attention to threatening information that is presented unconsciously (preattentive bias) have used dichotic listening tasks and visual masking paradigms. The latter are a better means of preventing awareness and have been performed with the Stroop test and the dot probe task. To check whether the subjects were really unaware of the stimuli, an awareness check was performed in two ways: subjects were asked whether they saw a word in the masked phase and whether the letters displayed formed a word or not. This showed that the thresholds of awareness differ for different stimuli: someone may be aware of a stimulus, but not of its content. With both designs, it was ultimately demonstrated that there is also a preattentive bias in people with an anxiety disorder. This suggests that the bias operates at an early stage of information processing.

It has also been shown that the pre-attentive bias for threatening stimuli can be a cognitive vulnerability factor for emotional disorders. This means that people who have a permanent tendency to automatically selectively focus attention on threatening stimuli are more likely to develop an anxiety disorder when in stressful situations.

The influence of state and trait variables on preattentive and attentional bias

Evidence for the relative influence of trait and state variables on attentional bias comes from three areas: correlations between bias and self-reported anxiety, the effect of short- and long-term stressors on preattentive and attentional bias, and the effect of reducing state anxiety through treatment. The last two are discussed here.

The effect of stressors

The interaction hypothesis contains two components that are important for the mechanisms that underlie attentional bias. The first is that there is no difference between people with high and low trait anxiety when there is no stress. As the state anxiety increases, a difference becomes apparent and the people with high trait anxiety become sensitive to threatening stimuli. This sensitivity therefore appears to be a latent and not a manifest vulnerability factor for fear.

The second component deals with responses to the attention of people with low trait anxiety. These people tend to divert their attention from a threatening stimulus.

The first part of the hypothesis has been confirmed in various studies: although short-term stressors also elicited an attentional bias in people with low trait anxiety, long-term stressors only elicited an attentional bias in people with high trait anxiety. If evidence for the second component were also found, this would mean that the cognitive key factor underlying the vulnerability to anxiety is mediated by the mechanism that determines the direction of attentional bias.

No significant evidence was found for the second part of the hypothesis, as a result of which the relative effects of state and trait anxiety on preattentive and attentional bias remain uncertain. Moreover, no distinction is made between different options, such as:

  • A threshold effect in which the effects of state and trait variables are additive, so that vigilance for threatening words is primarily shown by people with high trait anxiety who are under stress.
  • A curvilinear relationship between the threat value of the stimulus and the attentional responses to the threat.
  • Different patterns of bias in people with high or low trait anxiety, which become apparent under increasing state anxiety. There is not enough evidence that, with increasing threat, people with low trait anxiety turn away from the stimulus.

The effect of anti-anxiety treatment

It was argued that if the attentional bias for threat is a long-term cognitive vulnerability factor for anxiety, it should still be present after treatment. However, studies show that this bias does disappear after therapy (in people with generalized anxiety disorder). The question was whether the change had taken place in controlled strategies or in automatic, unconscious processes; the latter turned out to be the case.

Attention bias for illustrated stimuli

The studies described above have only been performed with the help of word stimuli and these have a limited scope in terms of threatening value. Illustrated stimuli such as emotional faces or scenes have a somewhat larger scope.

There are a number of tasks that can be used to examine an attentional bias for emotional faces, including the dot-probe detection task, pop-out tasks, and eye movement registration. Most of the studies revealed emotion-related effects, although there was variability in the results regarding the primary influence on attentional bias: some claimed it was state anxiety, others trait anxiety or negative emotional valence in general. Differences have been found with respect to studies that used word pairs. For example, with pictorial stimuli no stressor is needed to provoke the bias for threatening faces, whereas with word stimuli there is. The hypothesis that an attentional bias for threatening stimuli is characteristic only of people with high trait anxiety therefore applies only to relatively weakly threatening stimuli such as words.

The interaction hypothesis says that the more threatening the stimulus is, people with high trait anxiety focus their attention on the stimulus and people with low trait anxiety turn their attention away from it. The cognitive-motivational point of view says that an increasing threatening stimulus also causes a bias in people with low trait anxiety. If this did not happen, the Valence Evaluation System would not work.

Research with neutral and emotional scenes has indeed found a generally greater vigilance for highly threatening scenes compared to mildly threatening ones. In addition, the low-anxiety group showed avoidance of mildly threatening scenes, and this avoidance decreased as the stimulus became more threatening. People with high trait anxiety also exhibited greater vigilance for threatening scenes than people with low trait anxiety. These results therefore speak against the interaction hypothesis and for the cognitive-motivational view.

Time course of attentional bias

There is therefore sufficient evidence for a bias in pre-attentive attention and attention processes, but the research described so far focused primarily on initial orientation towards the threatening stimulus. The question now is what happens next. The hypothesis is that people with high trait anxiety or clinical anxiety first focus their attention on the stimulus and then try to avoid detailed processing taking place so as not to become overly anxious. This can ensure that the fear is maintained.

Preattentive and attentional bias in depression

The question is whether depression is associated with an attentional bias for negative stimuli similar to the bias in anxiety. Studies have shown that depressed people do not have a preattentive bias for negative information, so they do not automatically focus on it. However, when the stimulus is presented for longer and the information has entered attention, they have greater difficulty disengaging from it. Depressed people thus do show a bias in later processes such as sustained attention. This is consistent with the predictions of the cognitive-motivational point of view, which says that the manifestation of a bias for aversive stimuli in preattentive and orienting processes depends on two things: goal engagement and valence.

Article summary with The effect of a single-session attention modification program on response to a public-speaking challenge in socially anxious individuals by Amir a.o. - 2008

Introduction

Research shows that socially anxious people may focus their attention primarily on threatening information. Selective attention to negative social stimuli leads to more fear and distorts the appraisal of social events, which in turn leads to ineffective social behaviour. A commonly used task to investigate the attentional bias related to social anxiety is the probe detection task with faces. The participant is shown two faces, one neutral and the other threatening. On certain trials, one face is replaced by a probe (a letter, for example), and participants must press a button when they see it. Faster response times for probes that replace the threatening faces indicate a bias for threatening information. Several studies have confirmed this bias, but other studies have not found a significant attentional bias. An explanation for this inconsistency is that even if an effect exists, not every study is expected to find significant results unless the effect size is very large, which is rare in psychological research. Research in which participants were randomly assigned and their attention manipulated also confirmed the hypothesis that an attentional bias towards threatening stimuli confers a sensitivity to negative affectivity under stress. However, the differences found may also reflect direct and indirect effects of the task on participants' mood. In general, individual differences in attention to threat-relevant and negative information appear to be important in mediating vulnerability to negative affectivity.
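The logic of the probe detection measure can be made concrete with a small sketch. The response times below are invented, and the scoring shown is the commonly used difference-score index; the exact scoring in Amir et al. may differ in detail.

```python
# Sketch of how an attentional bias index is commonly computed from
# probe detection response times. The RTs below are invented data.

from statistics import mean

# Response times (ms) by probe location
rt_probe_at_threat = [520, 540, 510, 530]    # probe replaced the threat face
rt_probe_at_neutral = [560, 575, 550, 565]   # probe replaced the neutral face

# Positive bias score = faster responding when the probe appears where
# the threat was, i.e. attention was already at the threat location
# (vigilance for threat); a negative score would indicate avoidance.
bias_score = mean(rt_probe_at_neutral) - mean(rt_probe_at_threat)
print(bias_score)  # 37.5 ms: vigilance for threat in this toy data
```

The same difference score, computed per participant, is what allows training conditions such as the AMP and ACC described below to be compared before and after the manipulation.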

This study

The current study examines whether the attentional bias for threat is causally related to the maintenance of social anxiety. To this end, the effect of a single session of attention training on reducing the anxiety response to a social stressor in people with social anxiety was investigated. Compared to the Attention Control Condition (ACC), it was predicted that the Attention Modification Program (AMP) would decrease the attentional bias towards threat. Less fear and better social performance were also expected.

Ninety-four participants took part. The following materials were used:

  • Self-report measurements (questionnaires).
  • Behavioral assessments.
  • Attention bias modification stimuli: the faces in the dot probe task.
  • Attentional bias assessment stimuli: threatening and neutral words.
  • Probe detection task (Attention Bias Modification Task): dot probe task with faces.
  • AMP: the probe replaces the neutral face.
  • ACC: the probe replaces the neutral face and the threatening face equally often.
  • Assessment of the attentional bias.

Attention training appeared to effectively reduce the attentional bias towards threat and the fear response to a social challenge. Moreover, attention modification appears to be effective in individuals with a high level of anxiety. The most plausible explanation is that the AMP task leads participants to disengage their attention from the threatening stimuli. It can be concluded that any procedure that normalizes the bias may reduce anxiety symptoms. In conclusion, the findings support a cognitive model of social phobia: selective attention to threatening social information may be causally related to the maintenance of pathological social anxiety. A growing body of research shows that anxiety is linked to deficits in attentional control: anxious participants with poor attentional control find it difficult to disengage their attention from threatening information. In general, exposure is used as a therapy for fears; the current results suggest that not all forms of avoidance of frightening stimuli are maladaptive.

Limitations

A number of limitations of this research are:

  • There is no follow-up data from the participants. So it is not known whether the effects are long-lasting.
  • The level of anxiety among the participants was moderate, and no clinical interview was conducted to establish diagnoses.
  • For future research it is important to investigate alternative conditions of attention control in anxious populations.
Article summary with Cognitive vulnerability to depression: A dual process model by Beevers - 2005

Introduction

Cognitive theories of vulnerability to depression state that cognitive factors play a causal role in the disorder. This article examines one such cognitive theory of depression, namely the dual process model.

A General Dual Process Model

Most dual process models of cognition have three components. First, there is an associative processing component. Associative processing is responsible for fast and effortless information processing: information previously associated with a particular stimulus becomes active again upon seeing the same stimulus, so that past experiences facilitate the processing of current information. Smith and DeCoster (2000) state that associative processing is guided by the correspondence between the current stimulus and information obtained from previous stimuli. Our impression of a stimulus is thus a combination of what we see at the moment and what our previous experiences tell us. Associative processing is stable and builds up slowly. Learning new rules is automatic and does not depend on attention, but it requires many repeated experiences. This is what allows it to run so fast.

Secondly, there is reflective processing. Reflective processing is more conscious but slower than associative processing, and sequential rather than parallel. It can be influenced by a single experience, and learning new rules depends on attention.

Thirdly, there is the question of when associative and when reflective processing is used. Most dual process models agree that associative processing is the default and is engaged automatically. Reflective processing is engaged when expectations are violated: associative processing is dominant until an unexpected stimulus is encountered, or until associative processing does not lead to the intended outcome. In those cases, people switch to reflective processing. Yet even when expectations are violated, reflective processing is not always possible: it takes effort, and when cognitive resources are limited, reflective processing can be disrupted.

The interaction between associative and reflective processing is important for the regulation of emotions. Forgas (2000) states that associative processing is used to maintain a certain emotional state by bringing in information that is congruent with the current emotion. Reflective processing is used to change emotion by bringing in information that is incongruent with current emotion.

A Dual Process Model for Cognitive Vulnerability to Depression

A negative bias in associative processing about one's self-image constitutes a cognitive vulnerability to depression. There is substantial evidence that how people feel about themselves contributes strongly to their vulnerability to depression. A self-schema is an organized representation of one's previous experiences. Beck (1979) stated that vulnerability to depression is increased when self-schemas are negative.

When someone's associative processing is negatively biased, reflective processing can in principle still correct this bias. There are at least three cases in which biased associative processing is not corrected. First, when biased associative processing violates expectations, but the cognitive resources to switch to reflective processing are not available. In the absence of cognitive load, people who are vulnerable to depression can use reflective processing to correct their negative associative processing. However, when cognitive load is increased through time pressure, a competing task (such as remembering six digits), or stress, one cannot fall back on reflective processing. The result is that the person's cognitive vulnerability becomes visible.

Secondly, when biased associative processing does not violate the person's expectations, so that there is no trigger to engage reflective processing. In depressed people, biased associative processing does not trigger reflective processing: they respond the same to functional and dysfunctional statements, whereas non-depressed people engage in more reflective processing when their expectations are violated.

And third, when biased associative processing violates expectations, but reflective processing cannot adequately correct the negatively biased associative processing. An example is rumination. It is often thought that thinking about one's depression, and wondering where the symptoms come from and what their consequences are, helps to overcome depression. However, this is not the case: ruminating about depression actually worsens it.

Under each of these circumstances, associative processing is not adequately corrected, and a cognitive vulnerability to depression therefore arises. If associative processing remains uncorrected, a negative spiral is created in which the vulnerability to depression becomes worse: the more negative one's mood becomes, the fewer cognitive resources are available to switch to reflective processing. This dual process model thus emphasizes a negative bias about one's self-image in associative processing and the inability to use reflective processing to correct such biases.

Development of Cognitive Vulnerability

Associative processing is shaped by regularities that are formed gradually. However, some experiences may have more influence on the formation of an associative memory network than others. These experiences can be divided into early experiences, affective experiences, and cultural beliefs.

Experiences early in life have more influence on the formation of an associative memory network than experiences later in life. Depression can therefore develop in people whose early life experiences negatively influenced their self-image.

Affective experiences also have an important role in forming associative processing. Evidence has been found for the hypothesis that associative processing is correlated with parts of the brain that regulate emotional experiences. This suggests that a negative affective experience can have a lot of influence on associative processing.

Finally, cultural beliefs influence the formation of associative processing. Cultural beliefs are often shared by an entire social community, and people who belong to a minority group often internalize the views of the broader community. Evidence has been found that both African Americans and Whites hold implicit prejudices against African Americans: African Americans show an associative bias against their own group because they have internalized the negative views of society.

Overcoming Cognitive Vulnerability

A dual process model suggests several ways in which people can correct for their cognitive vulnerability. First, expectations can be adjusted so that negatively biased associative processing triggers a switch to reflective processing; the necessary cognitive resources must then be available to make that switch. Another way is to adopt a reflective attitude and realize that the outcome of associative processing is only one perspective and not necessarily the truth. These strategies, however, only work in the short term. An ideal strategy would be to change associative processing itself so that it no longer produces a negative outcome. The repeated switch to reflective processing can change associative processing; this process is called consolidation. Consolidation occurs when a person has gained enough experience with a new association that it becomes integrated into the associative system. However, this process can take weeks to years, which shows that repeated exposure to new associations is needed to consolidate the results of reflective processing into associative processing. A cognitive vulnerability can thus be reduced by repeatedly using reflective processing to correct the bias that results from associative processing.

Conclusion

This article examined a dual process model of cognitive vulnerability to depression. The model emphasizes the interaction between two modes of information processing: associative and reflective processing. Associative processing is a fast, automatic form of information processing, while reflective processing is slower and requires more effort. It is assumed that negatively biased associative processing, in particular about the self, increases cognitive vulnerability to depression. This bias can be corrected through reflective processing if cognitive resources allow it; a cognitive vulnerability to depression is therefore mainly seen when cognitive resources do not allow this correction.

Uncorrected negative associative processing leads to a depressed mood. As Forgas and Ciarrochi (2002) established, reflective processing is important for the regulation of depressed moods: without reflection, a depressed mood only gets worse.

Article summary with Restrained eaters show enhanced automatic approach tendencies towards food by Veenstra & de Jong - 2010


Introduction

Restrained eaters have relatively strong positive automatic associations with high-fat food, while their explicit evaluation of high-fat food is more negative. This dissociation may explain their contradictory eating pattern: frequent dieting on the one hand and frequent overeating on the other. This study examined people's automatic approach tendencies towards high-fat and low-fat food. Another study has shown that overeaters have a stronger approach tendency than normal eaters: they approach food relatively quickly and are relatively slow when they have to avoid it. The task in this study was to move a manikin towards or away from a stimulus (a picture of food) using the arrow keys. The study examined the influence of automatic food-related approach tendencies and affective associations on disordered food intake.

Method

To investigate automatic approach tendencies, the manikin version of the Affective Simon Task (AST manikin) was used: participants were shown a stimulus and had to use the arrow keys to move a manikin towards or away from it. To determine participants' affective associations with a stimulus, the voice-key version of the Affective Simon Task (AST voice key) was used: upon seeing a stimulus, participants had to indicate whether they found it tasty or disgusting by saying so aloud.

Results

Restrained and unrestrained eaters did not differ in their self-reports of how often they ate high-fat and low-fat food. They also did not differ in their degree of hunger during the experiment.

AST voice task

There were three stimulus types (high-fat food, low-fat food, and neutral stimuli) and two response types (tasty or disgusting). Participants were faster when they found something tasty than when they found it disgusting, and faster with high-fat stimuli than with low-fat and neutral stimuli. An interaction effect was found between stimulus type and response: the reaction pattern differed between high-fat food, low-fat food, and neutral stimuli. However, restrained and unrestrained eaters did not differ. In general, participants found high-fat food tastier than low-fat food.

AST manikin

This task also had three stimulus types (high-fat food, low-fat food, and neutral stimuli) and two response types (approach or avoidance), with two groups: restrained and unrestrained eaters. A main effect was found: in general, participants were faster when they had to move the manikin towards the stimulus than when they had to move it away. An interaction effect was also found between stimulus type, response, and group: restrained and unrestrained eaters showed different reaction patterns for food and for neutral stimuli. Restrained eaters showed an enhanced approach tendency towards food, whereas such an approach tendency was absent in unrestrained eaters.

Restrained and unrestrained eaters reported different craving patterns. Unrestrained eaters reported more craving for low-fat food than for high-fat food, while restrained eaters reported no difference in craving between high-fat and low-fat food. There was a small correlation between automatic positive affective responses and automatic approach tendencies for high-fat food.

Discussion

This study examined the role of enhanced approach tendencies and affective associations in overeating. Regardless of whether they were restrained eaters, all participants showed stronger automatic affective associations with high-fat food than with low-fat food, while a preference for high-fat food was absent in the explicit self-report evaluations. Restrained eaters reported the same craving for low-fat food as for high-fat food, while unrestrained eaters had a stronger craving for low-fat food than for high-fat food.

This study provides no evidence for the hypothesis that strong automatic positive affective associations with food play an important role in the dysregulation of food intake in restrained eaters. In line with incentive-sensitization theory, the study does show relatively strong approach tendencies towards food in restrained eaters. These results also agree with Robinson and Berridge's (1993) hypothesis that motivational aspects of food play an important role in the dysregulation of food intake.

Contrary to expectations, restrained eaters did not show a different approach pattern for high-fat than for low-fat food. Apparently, restrained eaters have a motivational orientation towards food in general rather than a specifically strong motivational orientation towards high-fat food.

Restrained eaters showed the same amount of craving for high-fat as for low-fat food, while unrestrained eaters craved high-fat food less than low-fat food. This lesser craving for high-fat food in unrestrained eaters may be a protection against overeating.

This study showed that in normal eaters, affective associations are stronger for high-fat than for low-fat food, whereas no stronger motivation for high-fat food was found. Although restrained eaters showed the same pattern of affective associations, they showed stronger automatic approach tendencies for both high-fat and low-fat food. These enhanced approach tendencies towards food probably contribute to their dysfunctional eating pattern.

Article summary with Pavlovian influences over food and drug intake by Woods & Ramsay - 2000


Disruption of homeostasis

Food and drug intake have in common that they disrupt the homeostasis of the body. A number of regulatory systems track particular physiological parameters, such as body temperature and the blood glucose level. These systems can initiate neural responses when a change occurs, and eventually they can anticipate a disruption of homeostasis and respond to it in advance.

When drugs and food enter our bodies, they act on countless biological processes and thereby bring about change. Some of these effects are wanted, others are not. Through classical conditioning, people are able to preserve the positive effects of taking drugs and food while minimizing their negative effects. This learning principle applies to all regulatory systems in our body.

Biological consequences of food

It used to be thought that people eat because the level of fuel for energy in critical tissues of the brain is too low. These are the so-called "depletion-repletion" models, and there are arguments against them: people also eat when they are not hungry, that is, when they still have enough energy, and the restorative processes of eating take place too slowly to explain meal initiation. This does not mean that the body does not initiate processes of its own when the glucose level is low; it does. A low glucose level causes a number of regulatory responses that result in the body immediately secreting glucose into the blood.

However, food and the body's glucose level are related. Eating increases the blood glucose level when carbohydrates leave the stomach. To deal with this, the body secretes insulin from the pancreas, after which less glucose is released from the liver into the blood, restoring the balance. It has been discovered that this release of insulin can be brought under stimulus control through classical conditioning.

This means that if a person is confronted with stimuli (CS) that imply that food intake will follow (UCS), the body will release insulin before food is taken (CR) to prevent excessive levels of glucose in the blood. The body gives a kind of anticipated counter-reaction.

In addition to the level of glucose, there are other parameters that are regulated as a result of eating. These are, for example, body temperature and metabolic level for which a counter-reaction is given before the food is taken.

An important implication of these findings is that people can be flexible in their food intake and do not have to eat at set times. Normally someone eats several small meals a day. However, if a person's lifestyle restricts them to two meals a day, these meals must be large to maintain body weight. If no equilibrium-restoring processes took place in the body, the glucose level would then rise considerably. However, a person can only eat such large meals if the meal can be predicted.

The influence of drugs on regulatory systems

Drugs, too, interact with tissues and molecules in the body when they are taken. The effects of exogenously administered drugs that bind to certain receptors are similar to those of endogenous hormones and neurotransmitters, except that drug effects come about in unusual ways or combinations. Drugs can change normal physiology in countless ways, and most drugs work in several ways at the same time, creating the well-known drug effect. So just like food, drugs cause certain parameters in the body to change, thereby disrupting homeostasis.

In the case of drugs, these disruptions can also be corrected by learned anticipatory responses; here too, classical conditioning seems to be at work. The conditioned stimulus (CS) is the environmental cue, and the unconditioned stimulus (UCS) is the drug effect. After repeated CS-UCS pairings, the CS alone elicits a conditioned response in the absence of the drug. For example, taking a drug can cause body temperature to fall (hypothermia); this is the drug effect.

The unconditioned response (UR) then consists of the processes that cause the body temperature to rise again. After a few pairings of the CS with the drug effect, the CS alone provokes an increase in body temperature (CR). The net result of the drug intake is then zero: the person has become tolerant.
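This anticipatory counter-reaction can be illustrated with a toy calculation. The numbers and the simple linear learning rule are illustrative assumptions, not values from the article:

```python
def net_temp_change(drug_effect, compensatory_cr):
    """Net body-temperature change: the drug pushes temperature down
    (hypothermia) while the learned compensatory CR pushes it back up."""
    return drug_effect + compensatory_cr

DRUG_EFFECT = -1.0  # degrees C of unconditioned hypothermia (assumed)
CR_STEP = 0.2       # growth of the CR per CS-drug pairing (assumed)

for pairing in range(6):
    # The CR grows with each pairing, capped at the size of the drug effect.
    cr = min(CR_STEP * pairing, -DRUG_EFFECT)
    print(pairing, round(net_temp_change(DRUG_EFFECT, cr), 1))
# By the fifth pairing the net effect is 0.0: the subject is tolerant.
```

The point of the sketch is only that tolerance here is a property of the CS-drug pairing, not of the drug alone: with no predictive cue, the compensatory CR stays at zero and the full hypothermia occurs.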

Research with ethanol has also shown that tolerance for ethanol-induced hypothermia is situation-specific: a cue is needed that predicts the drug effect for tolerance to occur. Taking the drug on its own is not enough. The condition for tolerance is therefore an association between a regulatory neural reflex (UR) and a certain CS.

An experiment with rats has shown that they can become tolerant to motor discoordination through classical conditioning. The experiment involved three groups of rats that all learned to walk on a treadmill (CS) without making mistakes.

The experimental group was then placed in a narrow cylinder that was rotated so that the rats became dizzy (motor discoordination = UCS). Each adjustment of the motor control system is then a UR. When these rats were placed on the treadmill, they made many mistakes, but after repeated trials they became tolerant and made fewer errors (CR).

The control group was also placed in a cylinder, but it was not turned. The last group was put in the cylinder that was turned around, but never just before they had to walk on the treadmill.

When the experimental rats had become tolerant, all three groups were spun around in the cylinder. The experimental group then made significantly fewer errors on the treadmill. If, instead of spinning the cylinder, ethanol is administered to all rats, the effect appears to be the same: rats that were tolerant of the motor-discoordinating effects of the rotating cylinder were also tolerant of the same effects when provoked by ethanol. This phenomenon is called cross-tolerance. So someone can be tolerant of a certain drug without ever having taken it.

Article summary with A learning model of binge eating: Cue reactivity and cue exposure by Jansen - 1998


Introduction

During exposure, the cue is presented while the associated fear and/or avoidance behavior is prevented. The hypothesis of this article is that craving and binge eating, just like fear and avoidance behavior, are triggered by cues and can therefore be treated well with exposure. A binge is excessive eating in a short period of time over which the person has no control. There are strong feelings of craving before the binge, and people feel guilty afterwards.

Cue reactivity

Binge eating can be compared to drug addiction, where relapse is very common. If someone returns after rehab to a situation in which he or she used drugs, the memories of drug use are triggered and the craving grows. Cue reactivity is the response one has to a cue; this behavior is usually classically conditioned. Cues that were almost always present during drug use, such as taste, intake rituals, and the environment, come to predict the effects of the drugs. The same holds for binge eating: cues that are not always present during a binge have less influence on triggering a binge.

Three models have been developed that elaborate the relationship between drug addiction and classical conditioning. These models share one assumption, namely that cue reactivity predicts relapse. Cues, the conditioned stimuli, provoke a reaction (cue reactivity), which is the conditioned response. The conditioned withdrawal model of Wikler states that the conditioned response (CR) is the same as the unconditioned state of withdrawal. The conditioned compensatory response model of Siegel states that the CR is the opposite of the unconditioned drug effects. The conditioned appetitive motivation model of Stewart states that the CR is the same as the unconditioned drug effects.

These three classical conditioning models of relapse state that if one associates environmental cues (CS) with drug use (US) over a longer period, these cues come to influence drug use: they cause physical reactions in the addict, such as craving, which makes relapse into drug use more likely.

Binge eating and cue reactivity

Binge eating can be modeled in the same way as drug addiction. Here, food intake is the unconditioned stimulus (US) and the metabolic response to food the unconditioned response (UR). Food cues, such as smell and taste, can become conditioned stimuli (CS) that evoke cue reactivity: the conditioned responses (CR). It is assumed that this learned cue reactivity increases the chance of binge eating. A strong US produces strong conditioning, and thus strong cue reactivity (CR).

Predictions

The assumptions of the classical conditioning model are:

  • Food intake (US) in combination with strong environmental cues (CS) leads to cue reactivity: a strong urge to eat.
  • The mere thought of the CS will also lead to cue reactivity.
  • Provoking cue reactivity in normal eaters leads to a strong urge to eat.
  • Treatments that do not lower cue reactivity lead to more relapse than treatments that do.

Cognitive behavioral therapy (CBT) with in vivo exposure is the best treatment for bulimia nervosa. CBT breaks classically conditioned reactions because one objective of the treatment is to develop a normal eating pattern. This weakens the relationship between cues and binge eating, because the binge-type food is now also eaten without the cues. Eventually the cue reactivity decreases, and with it the urge to eat.

Cue exposure with behavioral prevention: Practical aspects

The learning model of binge eating states that environmental cues elicit reactivity as long as the cues are reliable predictors of binge eating, so as long as the CS is systematically reinforced by the US. The model predicts that cue reactivity will be extinguished when the CS-US relationship is broken. This relationship is broken if one is exposed to the cues while the binge that normally follows is prevented. This form of treatment corresponds to the treatment for phobias and OCD: during exposure the cues are presented while eating is prevented (response prevention). The purpose of the exposure is to elicit a strong urge to eat. A disadvantage is that the therapist acts as a safety signal during these sessions, a cue not to eat; the patient must ultimately be able to do without the therapist.

Exposure works better in vivo than in vitro, and for binge eating it works better with flooding: flooding causes greater craving, and resisting strong craving has a greater effect than resisting weak craving. Exposures of 50-90 minutes work best, and the sessions should occur frequently and in rapid succession: five times a week works better than fewer times a week.

Cue exposure with behavioral prevention: Empirical evidence

Drummond & Glautier (1994) found that participants who underwent cue exposure had better control over their intake than participants who received no exposure: they relapsed less and consumed less. No differences in drop-out were found between the two treatments. Monti et al. (1993), however, did find such differences: they found higher drop-out rates in the group treated with exposure plus coping training than in the control group that received standard treatment. Cue exposure leads to significant decreases in craving, while the physiological responses to cues do not disappear. Cue exposure works primarily to reduce the urge to eat during treatment.

So there are two ways to break the CS-US relationship: first, letting people induce craving without giving in to it; second, letting people eat their binge food in places where they would not normally eat it. The conditioning model predicts that avoiding the cues does not reduce cue reactivity and craving; only exposure to the cues in combination with prevention of binge eating will reduce craving.
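The article itself gives no formulas, but the logic of acquisition and extinction described above can be sketched with a standard Rescorla-Wagner-style update. The learning rate and trial counts are arbitrary illustrative choices, not values from the article:

```python
def update(v, alpha, lam):
    """One Rescorla-Wagner-style step: associative strength v moves
    toward the outcome lam (1 = the binge follows the cue,
    0 = cue exposure with response prevention)."""
    return v + alpha * (lam - v)

ALPHA = 0.3          # learning rate (assumed)

v = 0.0
for _ in range(20):  # acquisition: cue repeatedly paired with bingeing
    v = update(v, ALPHA, 1.0)
acquired = v         # close to 1: the cue strongly predicts a binge

for _ in range(20):  # extinction: cue presented, binge prevented
    v = update(v, ALPHA, 0.0)
extinguished = v     # close to 0: cue reactivity has extinguished

print(round(acquired, 3), round(extinguished, 3))
```

Avoiding the cues corresponds to skipping the extinction loop entirely: the associative strength, and with it the cue reactivity, simply stays high, which is exactly the model's prediction about avoidance.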

Conclusion

The learning model states that cues that precede binge eating (smell, taste, etc.) become conditioned stimuli that elicit cue reactivity/conditioned reactions.

People who suffer from binge eating are also more likely to develop depression. It is not wise to combine cue exposure with antidepressants: although antidepressants reduce craving, and therefore probably cue reactivity, they also ensure that the association between cues and binge eating is maintained. As soon as the antidepressants are stopped, cue reactivity rises again and binge eating increases.

Patients with anorexia should not do cue exposure, because their binge food is the only food they eat. Patients with bulimia should also not simply do cue exposure, because it can teach them to reject food, and this can develop into anorexia.

The best treatment is a combination of cue exposure with interventions that aim to develop normal eating habits and eliminate dysfunctional thoughts. Binge food must be treated with cue exposure. Dysfunctional thoughts should be treated with CBT.

Article summary with A cognitive approach to panic by Clark - 1986



It has long been accepted that panic attacks often occur in certain anxiety disorders. Panic attacks later became a separate subject of investigation, partly thanks to Donald Klein. He showed that patients with anxiety disorder with panic attacks respond to imipramine, while patients with anxiety disorder without panic attacks do not: the so-called pharmacological dissociation. The DSM-III then accepted this distinction by including "panic disorder" and "agoraphobia with panic" among the diagnoses.

A panic attack consists of a sudden, intense feeling of fear or impending doom combined with a variety of physical symptoms, such as breathlessness, palpitations, chest pain, dizziness, tingling in the feet and hands, feelings of suffocation, faintness, sweating, trembling, and a feeling of unreality. The majority of people who suffer from panic attacks have panic disorder or agoraphobia with panic. Panic disorder is only diagnosed if there have been at least three panic attacks in the last three weeks and these attacks do not occur only in particular anxiety-provoking situations. Agoraphobia with panic is diagnosed when someone shows fear and avoidance of certain types of situations and has a history of panic attacks.

Panic attacks can be triggered by various pharmacological and physiological agents, such as caffeine, carbon dioxide, and yohimbine. These agents hardly ever cause panic attacks in people without a history of panic attacks, but they do cause some of the sensations that are characteristic of a panic attack. On a biological account, this shows that certain biochemical changes have a panic-inducing effect and that people who are sensitive to these agents therefore have a biochemical disorder. However, there is also a psychological explanation for this phenomenon: the agents do not have a direct panic-inducing effect, but only cause panic if the physical sensations they produce are interpreted in a certain way. This is essential for the cognitive theory described in this article, which states that panic attacks result from catastrophic misinterpretations of certain bodily sensations. These are usually sensations such as dizziness and breathlessness (which normally accompany anxiety), but they can also be other sensations. The misinterpretation means that the sensations are experienced as far more dangerous than they are.

The sequence of a panic attack is as follows. Various stimuli can trigger a panic attack; these can be external, but are more often internal. When these stimuli are perceived as a threat, anxiety arises, along with a number of physical sensations. If these sensations are interpreted catastrophically, more anxiety arises, which produces more physical sensations, and so on.
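This self-amplifying sequence can be caricatured as a simple feedback loop. This is a toy model: the gain parameter and the numbers are illustrative, not from Clark.

```python
def panic_cycle(initial_anxiety, interpretation_gain, steps=5):
    """Toy feedback loop: anxiety produces bodily sensations, and the
    interpretation of those sensations feeds back into anxiety.
    A catastrophic interpretation (gain > 1) amplifies each round;
    a benign interpretation (gain < 1) lets the anxiety die out."""
    anxiety = initial_anxiety
    history = [round(anxiety, 2)]
    for _ in range(steps):
        sensations = anxiety                        # threat -> sensations
        anxiety = interpretation_gain * sensations  # appraisal -> anxiety
        history.append(round(anxiety, 2))
    return history

print(panic_cycle(1.0, 1.5))  # catastrophic appraisal: anxiety escalates
print(panic_cycle(1.0, 0.5))  # benign appraisal: anxiety fades
```

The single gain parameter is doing the theory's work here: the same initial sensation spirals into panic or dissipates depending only on how it is interpreted, which is the core claim of the cognitive model.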

Some panic attacks are preceded by a period of heightened anxiety; others are not and seem to come "out of the blue". In the first case there are two types. In the first type, the period of increased anxiety is caused by the anticipation of an attack, for example in a crowded place for someone with agoraphobia. In the second type, the period of increased anxiety is not caused by the anticipation of an attack. In the case of seemingly sudden panic attacks, the trigger is usually the perception of a physical sensation caused by an emotion (anger, for example) or by a harmless event such as standing up, exercising, or drinking coffee; the sensation is then interpreted catastrophically. Other sensations that can trigger a panic attack include shortness of breath due to exertion, or sensations that are not normally part of a panic attack. In addition, sensations arising from the perception of mental processes can also contribute to the vicious circle of panic (for example, the fear of going crazy when one's thoughts seem to get stuck). For some patients the sensations and associated interpretations remain constant over time; for others they change.

The literature supports the model on the following points:

  • Essential components of panic anxiety. The thinking of patients with panic attacks is dominated by thoughts that cause a catastrophic interpretation of sensations, such as thoughts of death or illnesses.
  • The order of events in a panic attack. Patients often mention a physical feeling as one of the first things they feel during a period of anxiety.
  • The role of hyperventilation in panic attacks. The sensations associated with hyperventilation often resemble those of a panic attack. In some patients, voluntary hyperventilation does indeed produce a "panic-like" feeling, and hyperventilation accompanies panic attacks. However, hyperventilation only leads to panic if the physical sensations are experienced as unpleasant and interpreted in a catastrophic way.
  • Panic can be caused by sodium lactate. Patients with panic attacks interpret the feeling caused by sodium lactate in a catastrophic way; 60-90% of patients get an attack when they are given sodium lactate.
  • Effects of psychological treatment. The most widely used and successful treatments are behavioral and cognitive-behavioral. CO2 inhalation is one example of a treatment: because CO2 inhalation is very effective at inducing panic-like sensations, repeated exposure allows patients to "get used" to the feeling. Neither the cognitive-behavioral treatment of Clark nor the behavioral treatment of Griez and van den Hout has been compared with other psychological treatments.

This makes it difficult to say anything about the source of their effectiveness. In contrast to the treatment of panic disorder, there is a generally accepted treatment for agoraphobia with panic, namely in vivo exposure. The question is whether cognitive-behavioral and behavioral treatments add anything to this.

  • The role of biological factors in panic. There are three ways in which biological factors influence the vicious circle of panic attacks. First, biological factors can influence the cause of an attack. Secondly, biological factors can influence the extent to which a threat brings about changes in physical sensations. Thirdly, biological factors can play a role in the extent to which sensations are interpreted in a catastrophic way.
  • Effects of pharmacological treatment. Three drugs have been investigated: propranolol is ineffective and diazepam is not effective in the long term. Imipramine appears to be effective, but it is always offered in combination with gradual exposure to feared stimuli. It is therefore possible that imipramine has no direct anti-panic effect.

Further evaluation of the model can follow if the following predictions are investigated:

  • Patients with panic attacks interpret physical sensations more catastrophically than other anxious patients and non-anxious controls.
  • Pharmacological agents only cause a panic attack if the resulting sensations are catastrophically interpreted, and the effects of these agents can be blocked by manipulations of those interpretations.
  • Patients whose treatment does not change their tendency toward catastrophic interpretation are more likely to relapse than patients whose treatment does.
Article summary with EMDR: Eye movements superior to beeps in taxing working memory and reducing vividness of recollections by Van den Hout & Engelhard - 2010


What is EMDR?

EMDR (eye movement desensitization and reprocessing) was introduced about twenty years ago as a treatment method for PTSD. EMDR appears to be effective for PTSD, about as effective as cognitive behavioral therapy. In addition, significant effects of eye movements have been found: eye movements during the recall of aversive memories diminish the vividness and emotionality of those memories. Memories are "labile" during recall, which means they are influenced by what is experienced while they are being recalled. We use our working memory to recall memories, and the capacity of this working memory is limited. If we perform a second task while recalling a memory, less capacity is available for the recall itself, which reduces the vividness and emotionality of the memory. Eye movements are such a "second task". EMDR exploits the unstable state of memories during recall and the fact that reconsolidation is influenced by the nature of the recall experience.

Horizontal and vertical eye movements are equally effective. Tasks that require hardly any capacity (such as simple finger tapping) have no effect, but more complex movements do. EMDR can be used not only for traumas, but also for negative ideas about future events. In addition, people who are bad at multitasking benefit more from, for example, eye movements while recalling.

Working memory (WM) consists of three subsystems:

  • Central executive (CE): allocates and distributes attention between tasks, activates memory, retrieves memories, inhibits distraction and selects retrieval strategies.
  • Visuospatial sketchpad (VSSP): holds and manipulates visuospatial information.
  • Phonological loop (PL): holds and manipulates verbal information.

Which components are affected by tasks such as eye movements? In general, the theory states that there is modality specificity: eye movements should tax the VSSP, and verbal tasks the PL. Research has found that eye movements reduce the vividness of images because they temporarily disrupt the maintenance and manipulation of traumatic images in the VSSP. In general, it has been found that tasks such as eye movements and counting affect memory. These effects are both general, via the CE, and modality-specific, via the VSSP or PL.
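As a rough illustration (added here, not from the article), this modality-specific taxing idea can be sketched as simple bookkeeping over the three subsystems. All capacities and task costs below are hypothetical numbers chosen only to express the qualitative claims in the text: every dual task draws on the CE, eye movements additionally tax the VSSP, counting additionally taxes the PL, and a visual memory is held less vividly the less CE and VSSP capacity remains.

```python
# Illustrative sketch, not from the article: bookkeeping over the three
# working-memory subsystems. All capacities and task costs are hypothetical
# numbers chosen only to express the qualitative claims in the text.

WM_CAPACITY = {"CE": 1.0, "VSSP": 1.0, "PL": 1.0}

# Every dual task draws on the CE; eye movements additionally tax the VSSP,
# counting additionally taxes the PL, binaural beeps demand little activity.
TASK_COST = {
    "eye_movements":  {"CE": 0.6, "VSSP": 0.5},
    "counting":       {"CE": 0.4, "PL": 0.5},
    "binaural_beeps": {"CE": 0.15},
}

def residual_capacity(task):
    """Capacity left in each subsystem while the dual task is running."""
    cost = TASK_COST[task]
    return {store: cap - cost.get(store, 0.0) for store, cap in WM_CAPACITY.items()}

def image_vividness(task):
    """A visual memory needs both CE and VSSP; the scarcer of the two
    residual capacities limits how vividly it can be held."""
    left = residual_capacity(task)
    return min(left["CE"], left["VSSP"])
```

On these toy numbers, a visual memory recalled during eye movements comes out least vivid, during counting intermediate, and during beeps most vivid — mirroring the general-plus-modality-specific pattern described above.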

An alternative to eye movements is binaural stimulation: the patient wears headphones and hears beeps alternating right and left. No verified data have been reported for this method. The question is whether it taxes WM at all, since it requires no active cognitive or motor response.

In a series of experiments, the researchers of this article examine the effects of binaural stimulation and eye movements.

  • In experiment 1 it was found that eye movements cause a substantial delay in the reaction time (in response to an auditory stimulus), which implies that eye movements influence the CE.
  • In experiment 2 it was found that doing sums (mental arithmetic) likewise delays the reaction time, here to visual stimuli.
  • In experiment 3, a random interval repetition (RIR) task was used, in which reaction times to randomly presented auditory stimuli were measured. Reaction times for such a task are shorter than for other tasks, and substantially shorter reaction times were indeed found here than in experiments 1 and 2. This task may therefore be more sensitive to CE taxing. In this experiment it was again found that eye movements cause interference. Binaural stimulation also draws on CE resources, but this effect is small compared to that of eye movements.
  • In experiment 4 it was found that eye movements and the accompanying taxing of WM decrease the vividness and emotionality of memories. Binaural stimulation has relatively little effect on the vividness of memories.

About 50% of EMDR treatments are performed with binaural stimulation instead of eye movements. Binaural stimulation does indeed tax the CE, but about four times less than eye movements do. For emotionality, eye movements do not have stronger effects than other methods, but for vividness they do; binaural stimulation also has an effect here, but it is only about one third of that of eye movements. EMDR sessions are much longer in clinical practice, so the effects may well be larger there. An auditory task would not be expected to tax the VSSP, just as a visual task would not be expected to tax the PL. The data from these experiments support the CE explanation of the eye-movement component of EMDR.

There are a number of limitations to the experiments reported here. First, the clinical effects of binaural stimulation versus eye movements were not tested. In addition, the quality of the negative memories was assessed by self-report. Moreover, long-term effects were not tested.

Article summary with Continuities and discontinuities between childhood and adult life by Rutter et al. - 2006


Introduction

One generation ago there were few psychiatrists and psychologists who approached mental disorders from a developmental perspective. Nowadays this is different: the developmental perspective is very popular and therefore frequently used.

Childhood precursors of schizophrenia

Originally, schizophrenia was conceptualized as a psychosis that generally begins in adolescence or early adulthood. Children who later developed schizophrenia, however, already had social, emotional and behavioral problems in childhood. For example, poor attention in children is a predictor of later schizophrenia. Epidemiological and longitudinal studies of entire populations have also shown that there are predictors of schizophrenia in early childhood, despite the fact that schizophrenia itself does not become manifest until adolescence or early adulthood. These predictors include delayed motor development and difficulties in language comprehension and cognitive functioning. These associations with schizophrenia are independent of socio-economic, obstetric and educational effects. Research has also shown that schizophrenia in adults is often preceded by social-emotional and behavioral problems. However, this also applies to many other psychopathological disorders, and therefore it is not possible to identify specific predictors of schizophrenia.

There are three problems that remain to be solved when it comes to predicting schizophrenia from childhood and adolescent traits. First, a distinction is often made between predictors (risk factors) and prodromal factors (early manifestations of a disorder). In general this classification is correct, but there is also a large proportion of people who exhibit prodromal factors yet never go on to develop the disorder. Secondly, the question is why these risk factors or early manifestations sometimes lead to the disorder and sometimes do not. Three possible answers are that brain development in adolescence is crucial, that heavy cannabis use contributes to the development of the disorder (but only in people who already have risk factors or early manifestations of schizophrenia), and that certain kinds of social adversity, such as migration and social isolation, make a major contribution.

Neuro-developmental disorders

Neuro-developmental disorders have eight main characteristics. The first is that they reveal themselves through a delay or deviation in psychological traits that are shaped by the growth process. Furthermore, the course of the disorder does not show remissions and relapses. In general, the disorder becomes somewhat less severe as a person matures, but it almost never disappears entirely. Furthermore, these disorders all involve a specific or general cognitive impairment, and there is a lot of overlap between the various disorders. The genetic influence on these disorders is very large, but environmental influences also play a role. Finally, it is striking that this category of disorders occurs mainly in males.

Autism and related disorders

Sometimes characteristics of autism are already visible in a child's first year of life, but usually they are not really visible before 18 months, and a diagnosis can usually only be made from the age of 2. The best predictors of outcome in autism are IQ and language use at the age of 5. Yet IQ is not a very good predictor, because outcomes among children with an IQ above 70 vary widely.

Specific language disorder

The name suggests that this disorder only causes language problems, but this is not entirely true. Only one in six people with this disorder has a permanent paid job, and one in six has never had a paid job. The majority do not live independently, more than half have problems with friendships, and just over a quarter have ever had a long-term relationship. Even though the language problems persist into adulthood, the biggest problem is actually social functioning and social relationships.

Research into this disorder has revealed a number of striking things. Children who have problems with language development at an early age can often catch up, but the problems frequently return when these children go to school, and then persist well into adulthood. It also appeared that there is a certain overlap between autism and specific language disorder. Genetic factors play an important role, especially in the more severe form of the disorder. Furthermore, this disorder should perhaps not be referred to as a language disorder, but rather as a more general social/cognitive disorder.

Dyslexia

Dyslexia can only be diagnosed after a child has reached an age at which the child should normally be able to read. Yet dyslexia is largely genetic and there are also a number of characteristics (cognitive and linguistic) visible in the years before a child goes to school.

ADHD

ADHD is strongly influenced by genetic factors, is mainly found in males, is associated with many other cognitive deficits and is thought to originate in problems with cognitive processing. The problems lie mainly in poor control of behavior, problems with the executive functions of inhibition and working memory, and delay aversion (an aversion to waiting).

ADHD often occurs together with ODD and CD, probably because they share underlying genes. In addition, children with ADHD have a greater chance of developing psychopathological disorders later. The prognosis for ADHD is poor, especially when hyperactivity is shown both at school and at home. ADHD certainly has effects in adulthood, but these often no longer meet the diagnostic criteria set for ADHD in children.

Depression

If depression occurs in adolescence, the chance of further depression in adulthood is high. It is often assumed that this chance is greater for women, but this is probably not the case. If depression occurs before puberty, the chance of developing depression later is not particularly high. It is therefore thought that depression in childhood differs significantly from depression in adolescence or adulthood. This idea is supported by the fact that children and adults respond very differently to antidepressants.

Heredity plays a greater role in depression that develops in adolescence or adulthood than in depression that develops in childhood. This is probably because depression in adolescence is mainly triggered by negative experiences, and the extent to which someone is affected by them is genetically influenced.

Negative experiences can cause depression in different ways. Children and adults who are depressed behave in ways that increase the chance that they will end up in stressful situations, which worsens the depression. Cognitive biases are also likely to play a major role in vulnerability to depression; children can show a bias for negative statements and memories from the age of five.

Comorbidity

Anxiety often starts in childhood and is a predictor of major depression in adulthood. Probably the same genes underlie both, but the expression of these genes differs across stages of life. Separation anxiety disorder, generalized anxiety disorder and panic disorder in particular are good predictors of later major depression. There are also various indications that a major depression in adolescence is in turn a predictor of anxiety in adulthood, especially of generalized anxiety disorder.

The question remains whether anxiety and depression are two different disorders, whether they share the same underlying problem, or whether they stand in a causal relationship to each other. The conclusion can be drawn that, in general, anxiety disorders are a risk factor in the development of depression. This relationship is probably mediated by hormonal influences in puberty and an increased chance of negative experiences and cognitions.

Antisocial behavior

Most adults who exhibit antisocial behavior have a long history of behavioral problems. Yet it is not the case that all children with behavioral problems go on to show antisocial behavior in adulthood.

Heterogeneity

Antisocial youth can be divided into two groups, which differ in when the behavior arose, in when risk factors were present, and in how long the behavior persists. The prognosis is poor for people who have problems early in childhood, who are male, who have neuro-developmental problems and who have a negative environment. Antisocial behavior that only emerges in adolescence has a better prognosis, since some deviant behavior is a normal part of this phase and the behavior generally disappears when these young people start to take on responsibilities.

Not all people who show behavioral problems at an early age also show antisocial behavior in adulthood. A small group develops a positive adaptation style, but it is not entirely clear why. There is also a group that exhibits behavioral problems early on, but whose later problems are of a completely different kind: these people show social isolation, poor development of friendships, and a vulnerability to anxiety and depression in adulthood. This group often shows an aversion to social situations and neuropsychological problems in childhood. It may simply be a question of personal characteristics, or early negative experiences and problems may lead either to avoidance behavior or, conversely, to openly antisocial behavior.

It is also questionable whether the prognosis for people whose antisocial behavior starts only in adolescence is really so much better than for an earlier onset. After all, these people still show much more criminal behavior and substance abuse later on, and mental health problems are also common.

Persistence and desistance mechanisms

Genetic factors play a major role in the origin and continuation of antisocial behavior, but there is certainly interaction with the environment. Genetic factors can influence how negatively others react to a person and can determine which situations a person ends up in.

If such a person comes into contact with stable, positive relationships, the early trajectory of behavioral problems can be turned around. The extent to which someone benefits from positive circumstances also depends on the person himself. Turning points are often associated with moments at which someone's past is clearly separated from the future, such as new opportunities for relationships, new social networks, new options for controlling behavior, structured activities and the formation of a new identity.

Risks of psychiatric disorder in adulthood

Antisocial behavior in childhood and adolescence is strongly associated with later psychiatric disorders. Behavioral problems at the age of 11-15 increase the risk of all psychiatric disorders, both internalizing and externalizing, at the age of 26. Many factors probably play a role in this, such as substance use encouraged by friends, poor family ties and a need for money for drugs and alcohol. Behavioral problems often go together with dysfunction in both behavioral and emotional domains.

Substance abuse

The vast majority of young people do experiment with alcohol and drugs, but in general this does not result in substance abuse in adulthood. The outcome depends, among other things, on other disorders that people may have; behavioral problems in particular correlate strongly with later substance abuse. ADHD also increases the chance of later substance abuse, but it is possible that this link is mediated by the behavioral problems associated with ADHD. The link between depression and substance abuse is more complicated. Depressed people may use alcohol and drugs as a kind of self-medication, but a direct effect of depression on substance dependence has not yet been found. The prediction from substance abuse to later depression, however, is very strong. Substance abuse in adulthood is associated with both hereditary and psychosocial problems.

Umbrella themes

Concepts that are often used when looking at psychopathology over the life course are heterotypic continuity and psychopathological progression. These concepts suggest that there is meaningful continuity in the different disorders, but that their manifestations may differ at different stages of life. Examples include reading problems in children that predict spelling problems in adulthood; neurodevelopmental problems in young children, psychosis-like symptoms in later childhood, and psychosis and schizophrenia in adolescence and adulthood; and early anxiety symptoms as predictors of later depression. However, it is still not entirely clear why anxiety symptoms predict later depression (the most common direction) or why depression sometimes predicts later anxiety disorders. With schizophrenia, the development is even more complicated: language and motor deficits are predictors of schizophrenia, but there is little continuity in them.

The same applies to psychotic symptoms. The predictors of schizophrenia are therefore much more common than schizophrenia itself.

It is not entirely clear how the progression from a specific language disorder to a general social disadvantage fits into this picture. It is not plausible to see the social problems as completely independent of the language disorder; the social problems are probably present earlier and probably start with a delay in the use and understanding of language. The term heterotypic continuity is therefore probably also applicable here. The change from early behavioral problems to later substance abuse, and from ADHD to a later antisocial personality disorder, can probably be better captured by the concept of psychopathological progression.

Early onset of symptoms

It is often assumed that an early onset of symptoms indicates more severe psychopathology. Really systematic research into this has not yet been done, so nothing can yet be said with certainty about this claim. In general, psychopathology with a long history of symptoms is more serious than psychopathology without such a history.

Mediators of continuity and discontinuity

Genetic factors most likely play a mediating role in psychopathology. The chance of relapse also increases once someone has already experienced a disorder. In addition, some disorders increase the chance of someone entering a risky environment, and negative experiences in childhood or adulthood increase the chance of depression. Furthermore, the way an individual responds to a disorder influences the chance of a later relapse, and the way people think about the disorder influences its course.

Predictions

First of all, the question is whether later psychopathology involves abnormalities that are already present at an early age. It must then be examined whether it is at all possible to measure these abnormalities reliably and validly. False negatives and false positives are very common in research into early symptoms of psychopathology, so attention needs to be paid to this. A balance must be found between the risks for the many and the benefits for the few.
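The false-positive problem can be illustrated with a small base-rate calculation (a sketch added here, not taken from the article; the prevalence, sensitivity and specificity figures are hypothetical). Because severe disorders are rare, even a fairly accurate early marker flags far more children who will never develop the disorder than children who will:

```python
# Illustrative sketch, not from the article: why early screening for a rare
# disorder produces many false positives. All figures are hypothetical.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disorder | marker present), computed via Bayes' rule."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Even a fairly accurate marker for a disorder affecting 1% of children:
ppv = positive_predictive_value(prevalence=0.01, sensitivity=0.8, specificity=0.9)
# fewer than 1 in 13 flagged children would actually develop the disorder
```

This is exactly the balance mentioned above: intervening with everyone the marker flags exposes many children who were never at risk, while raising the threshold misses some who were.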

Theoretical perspectives

Development largely involves continuity and change, and both concepts involve coherence, lawfulness and organization. There is a continuous interaction between genes and environment, and development in the early years continues to influence later development. There are many individual differences in the course and causes of a disorder, and there is extensive continuity between normality and psychopathology.

Directions for further investigation

Attention must be paid to the alternatives that compete with each other as mediators of psychopathology, and this requires many measurements, research designs and data analyses. There is a particular need to identify factors that are important in the changes of adolescence. Until now, genetic designs have not been used much, and it would be good to make more use of them in research into the interaction between environment and genes.

Article summary with Developmental systems and psychopathology by Sameroff - 2000


Introduction

A persistent contrast in psychopathology is that the phenomena are classified as much as possible in a categorical system, while they are in fact very dynamic. Another area of tension is the contrast between studying serious mental disorders and studying mental health. When we focus on mental health, we always find extremes on every behavioral scale that resemble pathology, and when we focus on pathology, we find that there are also areas that resemble mental health. If we look at psychopathology from a developmental perspective, a disorder may turn out to be nothing more than an individual's adaptive response to an environmental experience. A final tension is found in the interaction between nature and nurture: when we study the environment we learn more about the individual, and when we study the individual, we learn more about the environment.

Important subjects

First of all, it is good to look at how we actually define pathology. An important topic from the developmental perspective is the continuity versus discontinuity of development. Here one can look at differences between individuals, but also at differences within an individual. Furthermore, it is difficult to grasp precisely how the process of development in individuals should be understood. Should we look at stable traits of the individual independent of the context, or at traits that are stable in the context, independent of the individual? Or should we look at the functioning of certain traits in an environment? It is also difficult to pin down what exactly the context is. Is it a passive collection of experiences that influence the potential of the individual together with their genes, or does experience itself change when it comes into contact with individual developmental processes?

In the past, individuals were often not seen as integrated systems of biological, psychological and social functioning, but were instead divided into biological and behavioral parts. However, this rests on three problematic principles that often do not match the reality of psychopathology: the same underlying cause should cause the same disorder in all individuals (children and adults); the same symptoms at different ages should have the same underlying cause; and all childhood disorders should lead to the same disorder in adulthood. None of these three principles applies to psychopathology. First, the same biological problem can lead to very different behaviors in both children and adults. Secondly, the same symptoms can arise from very different underlying processes, and thirdly, there is little evidence for continuity from childhood disorders to adult disorders.

Developmental psychopathology

Developmental psychopathology arose from the realization that a developmental perspective could contribute much to our ability to understand, treat and prevent psychopathology. The perspectives of developmental psychologists are often about the same phenomena as the perspectives of psychopathology, but the concepts are often approached differently. If these perspectives could be combined, it would therefore provide a broader view of psychopathology.

The question, however, is whether this model also applies well to individuals who do not stay on the expected developmental path. Some individuals seem to face a very good future, yet eventually end up with various psychopathological disorders, while others are very vulnerable and display disturbed behavior in childhood, but simply function well in adulthood. Results from laboratory research are not always equally applicable to reality. Scientific knowledge also often does not directly influence behavior, just as genes and atoms do not directly influence human behavior.

So there are still many problems that need to be solved before we can really understand psychopathology. The best model for understanding psychopathology should be one that responds to human behavior in all its complexity.

Studies on high-risk groups

Studies of groups with a high risk factor were the first attempt to transfer anything from the developmental perspective to the study of child psychopathology. The intention was a new perspective that looked at the entire life course of an individual. Mednick assumed that you can only really say something about underlying factors when you have studied someone before the disorder developed. Because not everyone could be studied, some form of selection was needed: a sample had to be found of people who already had a higher risk of a certain disorder than others.

In the schizophrenia study, a group was then selected whose parents or one of the parents had schizophrenia. The aim of the research was to discover early markers that could eventually lead to schizophrenia. Birth complications, certain motor patterns, attention processes and eye movement patterns were examined.

Three hypotheses were created in advance:

  • Deviant behavior in children could be traced to variables that can be associated with the diagnosis of the parents (schizophrenia in this case).
  • Deviant behavior in children could also be linked to traits associated with mental disorders in general, such as the severity and long-term nature of a disorder.
  • Deviant behavior in children can be linked to the social circumstances in which the child finds itself (excluding the psychopathology of the parents).

Little evidence was found for the first hypothesis. However, the second hypothesis was confirmed fairly strongly. So a mental disorder in general has significant consequences for the behavior of the children. Some evidence was also found for the third hypothesis, namely when looking at the social class in which children grow up.

Despite the fact that there were useful results from the research into high-risk groups, it remained difficult to translate this into reality. Research continues to be needed into the role of environmental experiences, and further research is needed into the development of both children with normal behavior and abnormal behavior.

Psychopathology

There are two basic questions that must be answered before child psychopathology can be understood: what does it mean to display disturbed behavior, and do these children differ in kind or in degree? In terms of intelligence, you can distinguish two types of children with low intelligence. One group is distinguished by a low score on an intelligence test, and these children are often considered mentally retarded. This diagnosis is then purely a consequence of the normal distribution and does not depend on the individuals themselves. The second group also scores low on intelligence tests, yet differs significantly from the previous group: their entire biological basis is different, as are the processes by which they develop and the treatments they need to improve their status. Are the children we are concerned about purely at the lower end of the normal distribution, or are they really different from the rest of the population?

Illness is generally associated with personal suffering. Although this is often the case for adults with a disorder, it is often not the case for children. With children, it is usually the environment that suffers from the child's behavior, and it is for that reason that the child ends up in clinical practice: the child stands out because it does not fit in. This does not mean that the children themselves experience no negative consequences, but such consequences are usually the result of abuse or neglect. In these cases, the responsibility for the diagnosis lies with the parents and not with the children. Problems with children therefore often involve not only individual problems, but also conflicts between the child and its context, and between the context and the child.

Individual development

One of the biggest problems for psychology in general, and for developmental psychopathology in particular, is the use of well-operationalized definitions. These definitions should divide the world into categories that are easy to grasp and that closely approximate the person's actual behavior. According to Werner, a thorough evaluation is needed of the factors that influence the development of patterns, as well as of the different paths through which the same developmental outcomes can be reached. It is important to map the processes that play a role in normal development, because disruptions in these processes can contribute to the development of deviant behavior or affect the adaptability of an individual.

Characteristics

An adjustment pattern is very difficult to measure, since the individual changes when the environment changes and vice versa. A simpler alternative is to look at the characteristics of a child, because these are easier to classify on the basis of a diagnostic interview or a behavioral questionnaire. First, a distinction is made between mentally retarded and mentally disturbed children. In addition, it is examined whether the child acted out of wrong passions or out of a wrong moral sense.

Depression

The question is whether a depressive disorder in children takes the same form as a depressive disorder in adults, and whether high scores on depression differ from low scores in quality or merely in quantity. A further problem is that there is a high correlation between symptoms of depression and symptoms of other disorders. Three levels of depression are distinguished, each with a different degree of severity: depressive mood, depressive syndrome, and depressive disorder.

It is almost never the case that children with depression have no other problems. Emotional problems often co-occur with behavioral problems. This is a striking fact, since it should already be rare for someone to have one disorder, let alone two. For depression, however, comorbidity is the rule rather than the exception. Anxiety disorders co-occur most often with depression, which can be explained by the fact that both disorders belong to the internalizing problems. Yet depression is also common alongside externalizing disorders, such as conduct disorder, ODD, ADHD, and alcohol and drug problems. This comorbidity becomes higher the more severe the depression is. A possible explanation is that children with depression are more exposed to other risk factors and that this leads to additional negative outcomes.

Conduct disorder

Problems that express themselves externally often have much more influence on the child's environment than problems that are internal. Crimes are often committed by teenagers and young adults, but are not easy to grasp in the categories of mental disorders. Adults with antisocial behavior often have a history of antisocial behavior in childhood, but by no means all children who exhibit antisocial behavior also show this in adulthood. Boys who show purely aggressive behavior are much less likely to show criminal behavior later than boys who are aggressive and hyperactive. These boys are also much more likely to have school problems, relationship problems and problems in conflict management.

With conduct disorder, a distinction is made between three developmental paths:

  • Authority conflicts (stubborn behavior, defiance, and authority avoidance).
  • Covert behavior (destruction, theft).
  • Overt behavior (aggression, fighting, and violence).

The worse the disorder is, the more likely it is that someone will develop on more than one of the three paths. So the worse a disorder, the more chance there is of comorbidity, but also the more chance there is of multiple forms of the disorder.

Individual development as an adjustment system

Underlying factors do not exist independently of development processes. Children are therefore integrated individuals and not a collection of characteristics. The worse the problems, the more likely it is that more than one area of behavior is involved. Looking at patterns of adaptation will not give a simple catalog of behavior, but it can lead to a better understanding of how children develop and how they deal with the positive or negative consequences of this development. In this way, every change in the child's life and how it relates to the whole can be taken into account.

As an environment becomes more organized, adaptation problems will decrease and as experiences become more chaotic, adaptation problems will occur more often. The development perspective provides an identification of factors that influence a child's ability to organize and adapt to experiences.

Environment

Research has shown that if the only risk factor for a child is psychopathology of the mother, the child would do well in life. However, if a child has a mother who, in addition to a psychopathological disorder, is also poor, poorly educated, lives without social support and is stressed, the child has a less positive future.

These factors give a poor prognosis for the future of children, even if the mother does not have a psychopathological disorder. The social environment is a greater risk factor than any mental disorder. It is also likely that it is mainly the quantity of risk factors present (how many there are) that matters, rather than their quality (how bad each one is).

However, the majority of children who grow up in low social classes or in certain ethnic groups do not develop a disorder. They get a job, have successful social relationships, and raise a new generation of children. These results have shifted attention from risk factors alone to protective factors as well. The distinction between risk and protective factors is by no means always clear, since a protective factor is often simply the exact opposite of a risk factor. Families with many protective factors had better outcomes than families with many risk factors.

Risk factors

The risk factors for depression, conduct disorder, substance abuse, and even schizophrenia all seem the same. Poor environmental factors that influence one outcome also influence the other outcomes. This result does not help much in the search for specific risk factors for specific disorders.

Control systems in development

A theory that wants to integrate our knowledge of pathology and our knowledge of development must explain how an individual and the environment interact with each other. Next, we must look at how good or bad adaptation patterns arise from this and how these adaptations have an effect on the future.

At the molecular level we have discovered that despite the fact that every cell has the same genotype, every cell manifests itself differently and has a different history. Also at the level of behavioral genetics we have learned that every family member has his or her own unique environment.

Environment types

Just as there is a biological organization (genotype) that regulates the physical appearance of someone, there is also a social organization that regulates the way in which someone fits in with their environment. This organization is created by family and cultural patterns and is often called environtype (as a counterpart to genotype). The behavior of a child therefore arises from an interaction between the genotype and the environtype. The genotype is the most important from conception to birth. The period from birth to adulthood is mainly characterized by influences of the environtype. Individual factors certainly play a role in this period, but it is mainly the environtype that influences the way in which a child can adapt to all kinds of situations.

Most disorders arise from an active search for a good adaptation to the environment by an individual. The positive and negative influences that the individual experiences will color this adaptation. No human outcome comes about without a contribution from experience.

Article summary with The small world of psychopathology by Borsboom et al. - 2011

Introduction

Comorbidity is the co-occurrence of two diagnoses in the same person. Traditionally, symptoms are seen as passive indicators of hidden (latent) disorders; comorbidity then stems from a single cause that produces both disorders. In clinical practice, however, symptoms are not merely passive indicators of disorders: they can themselves cause other symptoms. There are direct relationships between symptoms. There is no single cause of comorbidity; instead there are many pathways to comorbidity, and which pathway applies varies from person to person. A network approach has therefore been developed for mental disorders and comorbidity. In this approach, symptoms are not seen as indicators of hidden disorders, but as components of a network. Comorbidity arises from direct relationships between symptoms of multiple disorders. About half (47.4%) of all symptoms in the DSM-IV are directly or indirectly connected in one main component. This component of connected symptoms has the characteristics of a "small world".

Results

Study 1: Setting up and analyzing the DSM-IV network

All symptoms from the DSM-IV were represented as nodes. There is a connection between two nodes when the corresponding symptoms both appear among the criteria of the same disorder. Symptoms of the same disorder thus get a direct connection, while an indirect connection arises when two symptoms are linked via one or more intermediate symptoms. Two symptoms have no connection if there is no way to move from one symptom to the other through other symptoms.

We expect that symptom groups in the DSM are causally related to each other. Symptoms cluster together more than one would expect based on chance: half of all symptoms are connected with each other, and the paths from one symptom to another are relatively short.

The degree of a node indicates how many connections the node has with other nodes. The degree distribution of a network provides important information about the network structure. The degree distribution of the DSM network is exponential, which makes it a "single-scale network". The symptom with the highest degree is insomnia, followed by psychomotor agitation, psychomotor retardation, and depressed mood.

Another important characteristic of nodes is their "betweenness". This measures how often a node lies on the shortest paths between two other nodes. The four symptoms with the highest betweenness are irritability, distractibility, anxiety, and depressed mood.
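The degree and betweenness measures described above can be sketched on a toy version of such a network. The symptom names and disorder composition below are invented for illustration (they are not the article's actual DSM-IV data): two small symptom cliques share "insomnia" as a bridging symptom, so insomnia ends up with both the highest degree and the highest betweenness.

```python
from collections import deque
from itertools import combinations

# Toy "DSM network": symptoms of the same disorder form a clique.
# Disorder contents here are hypothetical, chosen so that "insomnia"
# bridges the two symptom clusters.
disorders = {
    "MDD": ["depressed_mood", "anhedonia", "insomnia"],
    "GAD": ["worry", "restlessness", "insomnia"],
}

# Build adjacency: an edge links two symptoms that share a disorder.
adj = {}
for symptoms in disorders.values():
    for a, b in combinations(symptoms, 2):
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)

def degree(node):
    """Number of direct connections of a symptom."""
    return len(adj[node])

def shortest_paths(src):
    """BFS: distance and number of shortest paths from src to every node."""
    dist = {src: 0}
    sigma = {src: 1}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                sigma[v] = 0
                q.append(v)
            if dist[v] == dist[u] + 1:
                sigma[v] += sigma[u]
    return dist, sigma

def betweenness(node):
    """Sum over s-t pairs of the fraction of shortest paths through node.

    Recomputes BFS per pair -- fine for a sketch, not for a large graph.
    """
    total = 0.0
    others = [n for n in adj if n != node]
    for s, t in combinations(others, 2):
        d_s, sig_s = shortest_paths(s)
        d_n, sig_n = shortest_paths(node)
        if t in d_s and d_s.get(node, float("inf")) + d_n[t] == d_s[t]:
            total += sig_s[node] * sig_n[t] / sig_s[t]
    return total

print(degree("insomnia"))       # 4: the bridging symptom has the highest degree
print(betweenness("insomnia"))  # 4.0: every cross-cluster shortest path runs through it
```

Within each clique every pair is directly connected, so only the cross-cluster pairs contribute to betweenness, and all of their shortest paths pass through the shared symptom.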

Study 2: Empirical data for the network structure

The "small world" feature means that symptom activation spreads quickly through the network. The network model explains a significant part of the comorbidity within the DSM-IV. If a person has one symptom from the DSM-IV, chances are that he or she will also develop another DSM symptom. The average shortest path length between two disorders equals the expected number of steps needed to get from a symptom of disorder A to a symptom of disorder B. The higher this path length, the further apart the two disorders lie in the network.

Study 3

The previous two analyses showed that path lengths and empirical comorbidity values were correlated, as one would expect if symptoms have a network structure. However, disorders are time-dependent. They are dynamic, and sometimes there is a time criterion for how long the symptoms must minimally be present. The network model takes this into account. To show how the network can predict comorbidity, a simulation was performed for two disorders, showing that the simulated comorbidity values are consistent with empirical data. Symptom dynamics means that the chance of developing a symptom increases as more of its neighboring symptoms are present. If someone has depressed mood and loss of interest, the chance of suicidal thoughts is higher than for someone who has loss of interest but no depressed mood. Even though the simulated networks are incomplete, the results provide evidence that a network model can reproduce empirical data.
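The symptom-dynamics idea can be sketched with a minimal simulation. Everything below is an assumption for illustration, not the article's model: the symptom names, the edges, and the transition probabilities are all invented. Each step, an inactive symptom switches on with a probability that grows with its number of active neighbors, and an active symptom can recover.

```python
import random
from itertools import chain

random.seed(0)  # reproducible run

# Hypothetical symptom network (names and edges invented for illustration).
edges = [("sad", "insomnia"), ("insomnia", "worry"), ("worry", "restless"),
         ("sad", "fatigue"), ("fatigue", "insomnia")]
nodes = sorted(set(chain.from_iterable(edges)))
neigh = {n: set() for n in nodes}
for a, b in edges:
    neigh[a].add(b)
    neigh[b].add(a)

def step(active, p_base=0.05, p_spread=0.3, p_recover=0.1):
    """One update: activation probability rises with active neighbors."""
    new = set(active)
    for n in nodes:
        k = sum(1 for m in neigh[n] if m in active)  # active neighbors
        if n not in active and random.random() < p_base + p_spread * k:
            new.add(n)  # symptom switches on
        elif n in active and random.random() < p_recover:
            new.discard(n)  # symptom recovers
    return new

active = {"sad"}  # start with a single activated symptom
for _ in range(20):
    active = step(active)
print(sorted(active))  # activation has (stochastically) spread through the network
```

With the probabilities chosen here, a symptom with two active neighbors is far more likely to switch on than an isolated one, which is exactly the mechanism by which the network model produces comorbidity: activation in one disorder's symptom cluster raises the chance of symptoms in a connected cluster.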

Discussion

The "missing heritability" problem refers to the fact that although individual differences in the vulnerability to develop a disorder appear to be largely genetically determined, only a small part of this genetic variance can be traced to identified genetic variants.

The strength of the symptom connections in a network can partly be explained genetically, but it is probably not the case that all connections in a disorder are influenced by the same genes in all people.

Article summary with The New Person-Specific Paradigm in Psychology by Molenaar & Campbell - 2009

In psychology, data have for years been collected using methods that focus on variation between test subjects. This variation is called inter-individual variation. Statistical measures (such as the mean and standard deviation) are calculated from this inter-individual variation. All statistical conclusions are drawn at the population level, and all types of research look at inter-individual variation: whether longitudinal or cross-sectional research is carried out, the analyses used focus on inter-individual variation.

According to some researchers, this is a drawback. When findings obtained from a population are applied to an individual, one shifts from the level of inter-individual variation to the level of intra-individual variation. Ergodic theory specifies under which conditions findings based on inter-individual variation can validly be applied to individuals. These conditions are very strict, and it will therefore rarely be possible to transfer findings from the inter-individual to the intra-individual level.

Conditions for ergodicity

Ergodicity concerns the conditions under which analyses of inter-individual variation yield the same results as analyses of intra-individual variation. To assess this, people must be seen as dynamic systems consisting of behavioral, emotional, cognitive, and other psychological processes that change over time.

According to Cattell, all these psychological dimensions can be organized with time as one dimension, psychological variables as a second dimension, and test subjects as the third dimension. This is called Cattell's data box. An analysis of intra-individual variation is called the P technique, and an analysis of inter-individual variation is called the R technique. These two yield the same results (and the condition for ergodicity is thus met) only if the population is homogeneous and the data are stationary.
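The difference between the P and R techniques can be illustrated with a small numeric sketch. The data below are fabricated for illustration: each person combines a stable trait level with a time-varying state shared across persons, constructed so that the within-person (P-technique) correlation and the between-person (R-technique) correlation come out with opposite signs, i.e., a deliberately non-ergodic case.

```python
from statistics import mean

# Hypothetical "data box": persons x time for two variables x and y.
persons = range(5)
times = range(10)
x = {(p, t): p + t for p in persons for t in times}   # trait level + daily state
y = {(p, t): -p + t for p in persons for t in times}  # trait reversed, state shared

def pearson(a, b):
    """Plain Pearson correlation of two equal-length sequences."""
    ma, mb = mean(a), mean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sd_a = sum((u - ma) ** 2 for u in a) ** 0.5
    sd_b = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (sd_a * sd_b)

# P technique: correlate x and y over time, within one person.
p_corr = pearson([x[0, t] for t in times], [y[0, t] for t in times])

# R technique: correlate x and y across persons, at one time point.
r_corr = pearson([x[p, 0] for p in persons], [y[p, 0] for p in persons])

print(p_corr, r_corr)  # approximately +1.0 within a person, -1.0 across persons
```

Because the population-level (R-technique) result has the opposite sign of every individual's (P-technique) result, applying the group finding to any single person here would be exactly wrong, which is the authors' point about non-ergodic processes.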

Homogeneity

For homogeneity, every test subject must conform to the same statistical model. An IQ test may measure general intelligence in one person while measuring verbal intelligence in another; in that case there is no homogeneity. If the strength of the items (for example, their factor loadings) differs from person to person, there is no homogeneity either.

Stationarity

The other condition for ergodicity is stationarity. This means that the statistical characteristics remain the same over time. This cannot hold when developmental processes are involved: the IQ of a four-year-old changes as he or she gets older, so its statistical characteristics are not stationary (and ergodicity is not met).

The writers of this article believe that psychological processes differ from person to person and that one should therefore look at intra-individual variation when studying these processes. This also has the advantage that such processes can be guided to an optimal level per person. According to them, doctors should look more at the individual person when prescribing the dose of a medicine, and less at the population average (the dose that is usually prescribed).

The intra-individual approach is fairly new and, according to the writers, deserves more attention; in their view, this approach is in fact necessary.

Article summary with Depressive Disorders and Interpersonal Processes by Segrin - 2010

Introduction

Depression is related to three interpersonal processes: problems with social skills, the reactions of others to depression, and dysfunctional family relationships.

Social skills deficits and depression

According to Lewinsohn's behavioral theory, people with depression have reduced social skills, which means they miss out on positive reinforcers from the environment. This makes the depression worse. Interpersonal reactions to someone's depressive behavior are sympathetic at first, but soon turn into rejection. Poor social skills are therefore a cause, but also a consequence, of depression.

Social skills of people with depression are often assessed through self-report. Segrin found evidence that people with depression rate their own social skills more negatively than others would rate them. However, a study by Lewinsohn indicated that people with depression view their own skills more realistically than non-depressed people, who tend to overestimate their own social skills. This pattern is called the depressive realism effect.

It therefore remains unclear whether self-report is reliable enough to determine the social skills of a depressed person. Nevertheless, in the majority of cases, depressed people rate their own social skills lower than non-depressed people do.

Social skills of people with depression can also be determined by the friends of those with depression. People with depression have lower social skills in their social interactions than people without depression. So apparently there is something in the social behavior of people with depression that stands out among their friends, resulting in a lower assessment of the social skills of those with depression.

Eye contact

People with depression make less eye contact with their conversation partners than non-depressed people do. The facial expressions of people with depression are less spontaneous and lively than people without depression.

Facial expressions

Depressed people have more control over their facial expressions and suppress a smile more easily. This makes it more difficult for the other person to read the depressed person's facial expression. Decreased social skills are visible not only in problems with the appropriate and effective execution of social behavior, but also in understanding the behavior of others. Depressed people have a negative bias in which they interpret neutral facial expressions of others as negative, and they have more difficulty recognizing a happy face. People with depression therefore have difficulty both looking happy and recognizing a happy face.

Posture

People with depression nod less during a conversation. They also use fewer hand gestures while telling a story.

Language

People with depression talk more slowly and softly, say less, have more and longer pauses, and respond more slowly to the behavior of others. They are more negative towards acquaintances than towards strangers.

According to Lewinsohn's behavioral theory of depression, impaired social skills are a cause of depression, but the evidence for this is conflicting. An alternative possibility is that poor social skills are a consequence of depression. The scar hypothesis states that a depressive episode damages ("scars") social skills, and that these reduced social skills then increase the risk of a new depression in the future. A third possibility is that poor social skills contribute to the onset of depression as a vulnerability factor in its development.

Those who have poor social skills and stressful environmental experiences have an increased chance of getting a depression. Having poor social skills makes people vulnerable to stressful events.

In line with the social skills deficit hypothesis, most people have problems with social skills when they are in depression. Poor social skills can be seen as a cause of depression, as a consequence of depression and as a vulnerability factor for developing depression.

Interpersonal responses to depression

Coyne's interpersonal interaction model is based on the influence of others' reactions on people with depression. Depressive behavior provokes a negative reaction in others: in the beginning people show sympathy, but after a while this turns into irritation. The depressed person notices this and perceives it as rejection, and this rejection maintains the depression.

Excessive Reassurance Seeking

Coyne observed that much of a depressed person's communication is aimed at seeking confirmation of the relationship with the other. This excessive reassurance seeking is common among depressed people; it irritates others and thereby provokes rejection. People with depression seek confirmation so often because they have low self-esteem and have experienced many negative things. According to the circumplex model, interpersonal behavior elicits a certain expected response from others; if someone does not respond according to this expectation, this causes negative affect in the other. If the partner of a depressed person gives the depressed person attention and confirmation, the partner expects the depressed person to do something with it. If the depressed person responds by seeking yet more confirmation, the partner's motivation to keep helping and supporting the depressed person runs out.

Emotional Contagion

The friends of a depressed person are often more irritated than people without a depressed acquaintance.

Interpersonal Rejection of People with Depression

People with depression are more likely to be rejected by others; depressed men in particular are rejected, and depressed strangers are rejected more readily than depressed acquaintances. According to self-verification theory, people are motivated to maintain their self-image in order to increase their sense of control. In someone who is depressed, this shows in the fact that people with depression mainly pay attention to negative feedback from others, because this information confirms their self-image. This maintains the depression.

Family relations and depression

Certain interactions within a family will increase the risk factors for getting a depression. Emotional abuse in the family creates a fear of criticism and rejection. Rose and Abramson (1992) stated that emotional abuse is very harmful to the child's self-image. If this self-image is repeatedly brought down by the parent, it will lead to a vulnerability to the development of depression. In the case of physical and sexual abuse, this association also exists, but it is less strong, because the child's self-image is not directly brought down by the parent.

There is a strong mutual relationship between depression and marital problems. Spouses with a depressed partner have a lot of problems with this and also have a greater chance of getting a depression. The interaction between men and women is often negative, conflicts are handled poorly. Another factor that reinforces the relationship between depression and marital problems is the fact that young people with depression marry too quickly. They see marriage as a solution for their depressive feelings.

Women are more sensitive than men to symptoms of depression, which leads to a worse marriage. A bad marriage, in turn, worsens depressive feelings in both men and women. Having depression also causes many problems with parenting; as a result, children of a depressed parent have a greater chance of developing depression themselves.

Conclusion

People are social animals, and social relationships play a major role in emotion regulation. If something goes wrong within such a relationship, it has a major negative impact on a person's emotions. The reverse also holds: if someone is depressed, this has a negative impact on his or her interpersonal relationships. Two interpersonal processes can increase the risk of developing depression: first, having poor social skills; second, having low self-esteem as a result of an upbringing marked by abuse.
