ISPS-US's Advocacy Committee penned the following letter in response to SAMHSA's recent Request for Information: Potential Changes to its Evidence-Based Practices Resource Center. In it, we conveyed our concerns about the re-evaluation of the Evidence-Based Practices Resource Center (EBPRC) and called for a more inclusive, transparent approach to evidence-based practices. We highlighted biases in the treatments currently listed, and stressed the need for transparency, the inclusion of qualitative measures, and a focus on long-term research as prerequisites for effective mental health treatment.
We urged SAMHSA to consider the limitations of manualized treatments and to prioritize the relationships between clinicians and clients in the pursuit of well-being-focused interventions. Furthermore, we emphasized the importance of a more inclusive approach to soliciting feedback from various stakeholders in the mental health field.
Read the full letter below:
Dear SAMHSA Administrators,
ISPS-US is the United States Chapter of the International Society for Psychological and Social Approaches to Psychosis. Our organization is unique in its inclusion of researchers, clinicians, individuals with lived experience, family members, and other stakeholders, all of whom have joined together to create a three-dimensional picture of what works in the treatment of people diagnosed with severe mental illness. We appreciate SAMHSA’s invitation to share our knowledge and experience, and do so urgently to address our significant concerns regarding your agency’s efforts to re-evaluate some aspects of its Evidence-Based Practices Resource Center (EBPRC) in order to better comply with Sec. 7002, subsections (b), (c), or (d) of the 21st Century Cures Act. We wish to speak to the general questions on which you have solicited comments, as well as to selected specific sub-questions you have posed.
Prior to addressing the specific issues SAMHSA has raised regarding the EBPRC, however, ISPS-US expresses concern that the public is being asked to suggest small enhancements to a piece of legislation that, despite some valuable provisions contained therein, was faulty from the start. Although the 21st Century Cures Act was touted, for instance, as offering a way to divert struggling individuals from incarceration, and as a violence prevention measure that could intercept unstable individuals early in their disease trajectories, it must be recognized that there was significant skepticism even at the time of its passage regarding the likelihood that it would succeed in producing the outcomes for which it aimed. Some of the objections came from medical providers and organizations who expressed concern that the Act fundamentally weakened the FDA’s capacity to uphold established standards for efficacy and safety.
But the law’s effect on mental health treatment was, in many ways, just as far-reaching and problematic. Much of the funding the bill mandated was appropriated directly to NIH/NIMH, the latter of which has consistently channeled its dollars for research and implementation towards treatments that prioritize biological and behavioral interventions rather than the relationship-based, community-embedded and consensual psychotherapies that have shown a century of consistent results, but which require more – not less – funding to study. In short, the scales have already been tipped in favor of circumscribed, short-term, symptom-focused (as opposed to well-being-focused) interventions that are easy to research and implement but which often produce circumscribed, fleeting, non-replicable or trivial results, and which are easily corrupted by special interests. Further, critics have highlighted the support offered by the Act for coercive treatment methods and increased funding for hospital beds rather than community-based programs. It is no wonder that insurance and pharmaceutical companies applauded its passage.
Question A: How can SAMHSA improve the EBPRC to better meet the needs of the behavioral health field?
SAMHSA can begin its improvement effort by emphasizing up front that it is a registry and not an endorsement of the listed treatments, and certainly not a guideline that insurance companies can use to ration care. The public must know that this is a list of treatments for which there has been financial support for research, and not a guide to recommended interventions, and practitioners should know that this registry should not be used as an interventional tool. Suggesting that practitioners could find the registry useful as a clinical roadmap is merely to point to inadequacies in clinical training, which, one would hope, could be strengthened through increased financial support, especially for trainees from marginalized groups.
The public should also be educated on the difference between efficacy and effectiveness. Data collected from laboratory studies necessitating controlled conditions rarely directly yield outcomes in the “real world” of client experience, and often rule out just the sorts of confounding variables that, taken together, have constituted the core of most individuals’ distress. Further, given that this registry reflects merely one leg of the “three-legged stool” of intervention, privileging research outcomes at the expense of clinical wisdom and client preferences, the public should be informed clearly that the two latter factors, which are almost always the most decisive, have been artificially removed from the findings offered in SAMHSA’s listing.
The public should also be educated on the unreliability of much psychological research. We are sure that you are aware of the replication crisis that has been acknowledged within the field of psychology, in which the psychological research that has been published is, as often as not, neither replicable nor reproducible (1). The funding of these studies has siphoned research dollars from the study of more intensive, complex and open-ended treatments that have, nevertheless, been lifesaving for many and that are, in the end, less expensive to implement than the coercive, short-term, medication-focused and non-relational programs that currently constitute the scaffold of a failing mental health treatment system (2).
Further, the registry could be strengthened by clarifying SAMHSA's criteria when deciding whether to include a given intervention. This would include offering complete transparency regarding conflicts of interest in the formulation of a given treatment, the funding sources of the studies cited, the research presented to document its effectiveness, the data collected, including adverse outcomes and drop-out data, and the publication process. Without this transparency, anyone consulting this registry would be at pains to understand, for instance, why longer-term so-called talk therapies whose efficacy has been documented in literally hundreds of RCT, case, narrative, and naturalistic studies were never included in the original registry, whereas other interventions with a scant research base consisting of only a few short-term and poorly-designed studies appeared there. In short, SAMHSA should be attuned to how its established process for adjudicating inclusion has already skewed its mission, resulting in creating a roster of short-term, easily-operationalizable, well-funded programs rather than a roster of effective ones.
Even more concerning are the opportunities for the EBPRC to reinforce fundamental inequities in the mental health system: if research funding for psychotherapy and other effective psychotherapeutic interventions is lacking, still less has there been significant financial support for funding treatments that work for people with the most severe diagnoses. Although there are research studies that document the effectiveness of Open Dialogue and psychoanalysis for the treatment of psychosis, for instance, these have mostly been done in countries other than the United States, as domestically, there is almost no support for research on psychosocial approaches to psychosis. Further, effective and innovative residential programs that offer alternatives to treatment as usual in psychiatric hospitals, such as Soteria Houses, attract even less research support due to the expense of these programs, which, while significant, pales in comparison to the expense of maintaining individuals as revolving-door inpatients or in carceral settings. Thus, the bias in research funding that such a registry reflects will likely only be amplified when it comes to intensive treatment programs for those with diagnoses on the psychosis spectrum.
Question B: What strategies should the EBPRC use to ensure its content is high-quality and supported by strong evidence?
The most critical factor in ensuring the high-quality content of the EBPRC would be its willingness to explore the significant limits of manualized treatments per se. There is plentiful evidence to suggest that the choice between manualized models is less impactful than the quality of the relationship between clinician and client, which takes time to develop, and which, as multi-modal evidence suggests, proves most significant in the long term, in terms of both overall client well-being and cost-effectiveness. We could add that the most robust evaluation models triangulate the kinds of evidence considered, including both qualitative and quantitative measures, and consider complex sets of experiential variables that represent the real-world experiences of those diagnosed with mental illness.
At the very least, the EBPRC should reflect standards already in place in the medical arena. We highlight the importance of adherence to the Open Science protocol, by which research questions and methods are published in advance, outcomes are recognized as valid only if they adhere to stated protocols (3), data are publicly available, and replicability is established (4). Negative results must be published, and slippages between the portrayal of results and the actual findings – for instance, when abstracts distort or flatly contradict study data and conclusions – should be highlighted, with clear consequences. It should go without saying that ghost-written articles by individuals, pharmaceutical companies, and product developers should be ruled out in all cases.
Question C: How can SAMHSA expand the reach of the EBPRC?
Most important in any effort to expand the EBPRC’s reach would be a broadening of the types of interventions and the definition of evidence to include those experiential values that often signify enduring cure rather than a more fleeting reduction of specific symptoms, but which tend to be discounted in a climate in which “evidence-based” has generally come to exclude “varieties of talk therapy that are relatively non-directive, time-intensive, in-depth, and exploratory in nature—typically under the psychodynamic and humanistic umbrellas of care” (5). This would require the assembly of a team of administrators who are proficient in qualitative and mixed-method research. Until such research is included in the EBPRC, there will be inherent bias in the effort, no matter which controls are implemented. Further, there should be a concerted effort to fund long-term research reflecting more complex outcome measures that encompass personal well-being, the strengthening of relationships, and the development of meaningful work engagement. It would be especially crucial to focus funding to evaluate the success of these treatments in meeting the needs of those diagnosed with severe and persistent mental illness.
Question D: How can SAMHSA solicit feedback on the use of its resources and information?
We appreciate SAMHSA’s solicitation of public comment, but would hope that its requests for input be circulated to all facets of the mental health system, including professional organizations, government entities, peer support organizations, researcher organizations broadly construed, and organizations for consumers of mental health services. We also assert that an outreach effort to consumers and families must go beyond the inclusion of representatives of a few large organizations such as NAMI and MHA, which themselves have financial ties to pharmaceutical and insurance entities.
Domain 1. Adding a program review and rating component to the EBPRC.
Question 1a: Please describe the extent to which a new EBPRC component that reviews the evidence for a manualized intervention/program and publicly posts the results would be of use to the behavioral health field.
This question is skewed towards the expectation that such a registry would, in fact, be of use. We request that attention be paid, as well, to how a new EBPRC could be a detriment to the field. In emphasizing less-expensive-to-implement, manualized short-term protocols, a new EBPRC stands to erode appreciation of the essential factors of individual difference, clinical wisdom, inequities in service distribution, cultural sensitivity, ethical values and social forces of stigmatization in its implied equation of quantifiable results and meaningful treatment outcomes. Further, practitioners often claim that such protocols lead to clinician burnout, as they restrict the use of interpersonal clinical capacities and create depersonalized work environments, contributing to rapid turnover precisely in the programs that are designed to meet the ongoing relational needs of the most acutely struggling individuals.
Domain 3. Culturally informed and community-driven programs and practices.
Question 3a: In what ways, if any, would an EBPRC component that assesses and shares findings from research on community-based and/or culturally driven behavioral health programs be of use to the behavioral health field?
Interventions that describe themselves as “culturally driven” often consist of manualized protocols that are then normed for individuals representing a variety of races and genders rather than reflecting a foundation in cultural meaning-systems at the level of program development. Given the complexities of cultural meaning-systems, as well as the complexities that are added in individuals’ processes of interpreting and incorporating these meaning-systems, we suggest that intervention be grounded in cultural immersion as well as extensive listening to the sensibilities of clients and communities. This cannot sufficiently be done within the scope of a manualized protocol, which mandates a focus on one-size-fits-all diagnostic rubrics, interventions, and outcome measures.
SAMHSA itself recognizes these difficulties in its RFI when it comments that “for several reasons, many community- and culturally-based programs are excluded from evidence-based registries or clearinghouses. How these programs are developed and implemented means that they cannot ethically or logistically be evaluated using randomized controlled trials or quasi-experimental designs that registries require for consideration. The programs can also be small in scale and geographically specific, making it even more difficult to randomly select participants or develop matching control groups.” We appreciate this thoughtful assessment of the dilemma of “evidence-based treatment” but encourage a re-evaluation of the project in its light, recognizing the need for either a significant expansion of that term with a recognition of the problematic effects of its prior, more circumscribed application or a re-allocation of attention and resources to efforts towards ongoing community listening as an open-ended and ongoing process.
We appreciate the opportunity to respond to this request for information, to assist SAMHSA in satisfying the 21st Century Cures Act, and to amend the Evidence-Based Practices Resource Center accordingly.
1. Yong, E. (2018). Psychology’s replication crisis is running out of excuses. The Atlantic, 19 Nov 2018. https://www.theatlantic.com/science/archive/2018/11/psychologys-replication-crisis-real/576223/
2. Lazar, S. (2010). Psychotherapy Is Worth It: A Comprehensive Review of Its Cost-Effectiveness. Lansing: American Psychiatric Publishing, Inc.
3. E.g., a Xanax study showed fewer panic attacks on Xanax than on placebo after 4 weeks, but more panic attacks on Xanax than on placebo after 14 weeks. The 4-week result was published and the 14-week result withheld.
4. E.g., Pigott's team had to file Freedom of Information requests to obtain the data for STAR*D, and Healy's team relied on a unique court decision to obtain the data for Study 329.