Making Psychiatric Assessments in Emergency Rooms

Summary

Project started on Nov. 1, 1993 (Completed)


The New York Office of Mental Health partnered with CTG to improve emergency psychiatric decision making. The primary goals of the project were to reduce inappropriate admissions and discharges, improve client and system outcomes, and reduce inconsistencies in emergency room decisions. To achieve these goals, the project developed decision support software and sought to apply this improved technology in a very harried, complex, and significant decision environment - an environment that deprives individuals of their liberty and consumes significant government resources.

The decision support software was designed to ensure that physicians ask all the appropriate questions needed to make an admission decision, and to help them sort and weigh the relative importance of the answers. The admission decision, however, is the province of the physician. The software was not intended to, nor can it, replace physician judgment.

Expectations for the CTG project were ambitious. OMH hoped ultimately for a psychiatric assessment product that could be sent to the 166 hospitals in New York for potential use in their emergency rooms. The complexity of the tool and the policies it represents, however, point to the need for much additional testing and revision of the prototype.


Scope of Work

An inappropriate decision to admit or discharge a psychiatric patient from an emergency room is often the starting point for a series of undesirable results. The individual inappropriately admitted is deprived of liberty and involved in a disruptive, stigmatizing event. Since each admission to inpatient psychiatric care costs the mental health system about $10,000, these admissions are also a costly misuse of scarce health care dollars. The individual who is inappropriately discharged does not receive the care he or she needs, and, in extreme cases, may engage in dangerous or violent behavior in the community. Over 135,000 emergency psychiatric assessments are conducted in New York's hospitals each year, and research shows wide variability in the resulting admission and discharge decisions. 

The project the Office of Mental Health (OMH) proposed to the Center for Technology in Government (CTG) was designed to address this issue through the development of a computer-assisted decision model to support psychiatric assessments in emergency rooms. The decision support model and software developed are not meant to replace the physician or the physician's own expert judgment. Instead, they support the practitioner in gathering and considering all information relevant to an admissions decision.

The first objective of the CTG project was the development of a formal decision model for use in psychiatric assessments in emergency rooms. An effective and workable model required not only careful study of relevant research but also consensus among experts. The Center for Policy Research at the University at Albany applied its extensive facilitation experience to this project objective. A group of 15 experts was brought together three times (in May, July, and September 1994) to define and reach consensus on the decision model. A selected group of three experts met a fourth time, in April 1995, to evaluate the readiness of the prototype for a field test.

The decision model produced in this first phase of the project specified the most important information needed for a psychiatric assessment and a method for combining individual items of information to provide guidance for a disposition decision. The model relates the various items of information to a set of key summary indicators, or modules. The modules are:

  • Danger to Self
  • Environmental Factors
  • Danger to Others
  • Client/Family Preferences
  • Mental Health Status
  • Availability of Services
  • Functional Impairment
  • Medical Conditions
  • Substance Abuse
  • Potential to Benefit from Treatment

The second objective was to create software for the decision model. The prototype system is a Microsoft Windows-based program that runs on a notebook computer and generates a descriptive profile of a client for each of the modules embodied in the model. These profiles are generated from users' answers to a set of approximately seventy-five questions about the client. Not every question needs to be answered for a given client; the system can handle situations in which only partial information is available. The prototype has two components: a user interface, which allows the clinician to supply information about the client, and an evaluation module, which presents summary output in three formats: scaled scores, a graphical representation of those scores, and a short narrative analysis.
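
To make the evaluation module's approach concrete, the sketch below shows, in Python rather than the language of the original Windows prototype, one way answers to a handful of questions might be weighted and combined into a scaled module score and a one-line narrative while tolerating unanswered questions. The module names come from the list above; the individual questions, weights, and 0-10 scaling are illustrative assumptions, not the model actually produced by the expert panel.

    # Illustrative sketch only: the questions, weights, and scaling are assumed
    # for demonstration; the actual model was defined by the project's expert panel.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Question:
        text: str
        module: str     # one of the modules listed above, e.g. "Danger to Self"
        weight: float   # assumed relative importance of a "yes" answer

    QUESTIONS = [  # a tiny, hypothetical subset of the roughly seventy-five questions
        Question("Expresses suicidal ideation", "Danger to Self", 3.0),
        Question("Has a specific plan and the means to carry it out", "Danger to Self", 4.0),
        Question("Reports a recent acute loss or stressor", "Danger to Self", 1.5),
        Question("History of assaultive behavior", "Danger to Others", 3.0),
    ]

    def module_score(answers: dict, module: str) -> Optional[float]:
        """Return a 0-10 score for one module, or None if nothing was answered.

        Unanswered questions are simply left out, mirroring the prototype's
        ability to work with partial information.
        """
        answered = [q for q in QUESTIONS
                    if q.module == module and answers.get(q.text) is not None]
        if not answered:
            return None
        raw = sum(q.weight for q in answered if answers[q.text])
        return round(10.0 * raw / sum(q.weight for q in answered), 1)

    def narrative(module: str, score: Optional[float]) -> str:
        """A one-line stand-in for the prototype's short narrative analysis."""
        if score is None:
            return f"{module}: insufficient information."
        level = "high" if score >= 7 else "moderate" if score >= 4 else "low"
        return f"{module}: {level} concern (scaled score {score}/10)."

    # A clinician answers only two of the questions.
    answers = {
        "Expresses suicidal ideation": True,
        "Has a specific plan and the means to carry it out": False,
    }
    print(narrative("Danger to Self", module_score(answers, "Danger to Self")))
    print(narrative("Danger to Others", module_score(answers, "Danger to Others")))

In the actual prototype a graphical display accompanied the scores; the point of the sketch is only that a module score can be computed from whatever subset of questions the clinician is able to answer.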

The third objective was to field test the software. The field evaluation provided a first assessment of the structure, value, and usability of the tool in one hospital's psychiatric ER. The evaluation results cover three topics: the strengths and weaknesses of the prototype, its potential for future use, and general reactions to the technical aspects of the application.


Lessons Learned

The project's goals, described in the summary above, were to reduce inappropriate admissions and discharges, improve client and system outcomes, and reduce inconsistencies in emergency room decisions. The prototype decision support software was designed to help physicians ask all the appropriate questions needed to make an admission decision and to sort and weigh the relative importance of the answers, while leaving the admission decision itself to the physician's judgment.

Expectations were ambitious: OMH hoped ultimately for a psychiatric assessment product that could be sent to the 166 hospitals in New York for potential use in their emergency rooms. The complexity of the tool and the policies it represents point to the need for much additional testing and revision of the prototype. Nevertheless, while still short of readiness for implementation, the project made much progress toward this ultimate goal. Specifically, OMH achieved:

  • A better understanding of the possibilities and limitations of technology use within the emergency room. 
    The project enhanced understanding of the emergency room process, including how it differs from setting to setting. This wide variability among emergency rooms highlighted the need for ER assessment protocols to improve consistency across these settings. In addition, much was learned about the possibilities and limitations of computer software in a psychiatric emergency environment. Very important to OMH, the project demonstrated the feasibility of software use by physicians. This knowledge has important value as a guide for further efforts to improve emergency psychiatry, which may include the use of information technology.

  • Significant progress toward a decision tool for use in emergency psychiatric assessment. 
    A national expert panel reached consensus on the basic structure of an instrument. Agreement was reached on the major areas (modules) to be assessed (such as danger to self) and on the core questions within each area. The instrument included areas identified as important by consumers of mental health services and their families. Further, important headway was made in developing consensus about how areas, and questions within areas, relate to other modules. For example, responses to questions about drug use and the presence of environmental stressors were recognized as ingredients in the danger to self module score (a relationship illustrated in the sketch following this list). Although both of these areas need further refinement, they form a good foundation for additional development. Progress was also made in establishing the relative importance that should be attached to answers to various questions. Finally, the project evaluation outlined the important next steps for OMH to take to finalize the software. The instrument has great value to OMH: it is ready for further field testing and eventual use as a training device and/or a decision aid in ER settings.

  • Use of an expert panel to achieve both consensus and legitimacy. 
    The dialog between emergency room practitioners and consumers of emergency psychiatric services and their family members allowed the airing of various and divergent perspectives in the expert panel meetings. These discussions enhanced understanding and empathy among the panel members and helped move the group toward consensus. Moreover, since the decision model was defined by a group of experts including practitioners, consumers, and officers of the American Association of Emergency Psychiatry, it has a level of authority and legitimacy well beyond what any one stakeholder could achieve alone. This group also represents a ready panel to be drawn upon in future work.

  • A basis for future agency-university collaboration. 
    The project initiated and strengthened important working relationships between OMH and the University at Albany that are likely to lead to future collaborative work.
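
The second bullet above notes that answers in one area, such as drug use or environmental stressors, were recognized as ingredients in the Danger to Self module score. The short Python sketch below illustrates one way such cross-module contributions might be represented; the question wording, module mapping, and weights are assumptions for illustration, not the weights agreed to by the expert panel.

    # Illustrative only: the cross-module mapping and weights below are assumed.
    # Each answer can carry weight in more than one module, so a "yes" to a
    # substance abuse question also raises the Danger to Self score.
    CONTRIBUTIONS = {
        # question text -> {module: assumed weight}
        "Recent heavy drug or alcohol use": {"Substance Abuse": 3.0, "Danger to Self": 1.0},
        "Acute environmental stressor (e.g., loss of housing)":
            {"Environmental Factors": 2.0, "Danger to Self": 1.5},
        "Expresses suicidal ideation": {"Danger to Self": 3.0},
    }

    def raw_module_totals(answers: dict) -> dict:
        """Sum the contributions of every 'yes' answer into per-module totals."""
        totals: dict = {}
        for question, is_yes in answers.items():
            if not is_yes:
                continue
            for module, weight in CONTRIBUTIONS.get(question, {}).items():
                totals[module] = totals.get(module, 0.0) + weight
        return totals

    print(raw_module_totals({
        "Recent heavy drug or alcohol use": True,
        "Expresses suicidal ideation": True,
    }))
    # -> {'Substance Abuse': 3.0, 'Danger to Self': 4.0}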

Value to State and Local Government

Many public agencies are responsible for programs that try to meet the needs of a diverse set of stakeholders. This project illustrates some ways to address that diversity and to seek consensus on both policies and actions. It also gives some guidance on the value of information technology as a way to bring needed expertise to decision situations.

  • The use of expert judgment panels is an effective way to identify differences, build credibility, and work toward consensus about complex issues.
    While it is feasible to develop a decision model based on the judgment of a single expert, a decision model that would be acceptable in emergency rooms across the state and useful to both specialists and generalists required a consensus among experts from various fields. Achieving consensus under these circumstances was not a trivial matter. Group decision support techniques developed at the University at Albany and elsewhere were used in this project to facilitate consensus among a diverse group of experts. Although this technology has been applied successfully in a variety of fields, this project marks the first time it has been used in psychiatric assessment.

    The method works because it focuses the panelists' attention on the task, makes explicit the reasons for disagreement that are usually difficult to uncover, separates false disagreement from real disagreement, and gives participants the tools they need to overcome some of the limitations that prevent agreement.

  • Prototyping encourages stakeholders to confront issues and make explicit choices. 
    During the design stage, the most important outcome of the prototype was the way it forced the panel to engage basic issues that had previously been avoided or treated superficially: (1) the specific purpose of the tool, and (2) the part of the intake process in which it would be used. Before the prototype was presented, the panel had discussed, but not settled on, one of several possible uses for the tool: an aid in conducting an interview, a tool for recording information after the patient interview, or a training device. Nor had the panel clarified the characteristics of the user for whom the tool was to be designed. Examination of the prototype brought these issues back to the surface, which led to greater clarity and clearer direction for further development.

    Once the prototype was taken to the field, these issues became even more apparent. The physicians who tested the system were able to make very precise comments about which features worked well and which did not, and to recommend changes far more specific than would have been possible in any other circumstances.

    This clarifying effect of prototyping appears to be an antidote to the common tendency to avoid making unambiguous decisions, especially in a group with conflicting interests and perspectives. Conflict within the group is avoided by either passing over tough issues, or dealing with them in overly general or superficial ways. A prototype will necessarily embody decisions on these issues. Therefore, confronting these decisions in the prototype forces the group and the user to deal with the implications and consequences of one choice or another. This can lead, in turn, to a more realistic and focused discussion of the issues, and clearer, more detailed specifications for a full system.

  • Policy advisors can play a useful framing role in software design. 
    This project rested on contested policies as well as presumably problematic practices. The expert group convened to design the decision model was not a typical software design team. It embodied competing perspectives on the underlying policy problems of emergency mental health services. In this case, the data set to be collected by the system represents a group policy about what is important in or required by an emergency psychiatric assessment.

    The public policy principles of openness, participation, and legitimacy are critical to the eventual acceptance of a decision support tool for emergency psychiatric assessments. Groups like the expert panel assembled for this project are often consulted for policy advice and they lend accountability, legitimacy, and political and substantive credibility to public deliberations and decision making.

  • The use of expert panels and consensus-driven models to design unstructured software applications is a process that still needs refinement.
    In most cases, a certain amount of vagueness in a policy statement is acceptable and sometimes even desirable. In this project, however, it produced an ambiguous model whose residual vagueness about purpose and intended user resulted in clear weaknesses in the software application.

    An alternative approach would have been to use the diverse expert panel to first create a policy framework, which would set the boundaries for a more traditional system development phase. In this first phase, the expert panel would define the purpose, the user, the categories of necessary data, and the expected results of the system. This expert consensus about the key factors in the ER decision-making process would then have guided a small group of system designers and actual users to create and test a prototype which reflects both the panel consensus on policies and the practical complexities of a working ER. The design team would be responsible for detailed specifications for how the system works and how a user interacts with it in the context of a real life setting.

    The prototype could then have been presented to the expert panel for further review and assessment of how well the prototype performs against their policy framework. Panel reactions and recommendations would become specifications for refinements in later versions of the prototype. The several iterations that would be necessary between the expert panel and the more traditional software design team would probably not have taken more time, would have relied more on the specific strengths of each group, and might have produced a more refined product for the field test.


Partners

Lead Partners

  • New York State Office of Mental Health

Corporate Partners

  • Borland International
  • Digital Equipment Corporation (now part of Hewlett-Packard)
  • IBM Corporation
  • Microsoft Corporation

Academic Partners

  • Jeryl Mumpower, Director, Center for Policy Research, Rockefeller College of Public Affairs and Policy, University at Albany, SUNY
  • Thomas Stewart, Director for Research, Center for Policy Research, Rockefeller College of Public Affairs and Policy, University at Albany, SUNY

Center for Technology in Government

  • Peter Bloniarz, Laboratory Director
  • Anthony Cresswell, School of Education
  • Sharon Dawes, Director
  • Ann DiCaterino, Manager, Project Support
  • Winsome Herard, Assistant Project Coordinator
  • Mark Nelson, Graduate Assistant, Information Science Ph.D. Program
  • Francis T. Nolan, Project Coordinator
  • Sandor Schuman, Research Associate

Participants

Expert Panel

  • Michael Allen, MD, Bellevue Hospital Center, NYC
  • Gail Barton, MD, VA Hospital, White River Junction, VT
  • Mebbie Breadley, RN, St. Joseph's Hospital, Elmira
  • Anne Marie Brown, Mobile Crisis Service, NYC
  • Gladys Egri, MD, Erie County Medical Center
  • Peter Forster, MD, American Assn. for Emergency Psychiatry
  • Celeste Johns, MD, Imogene Bassett Hospital, Cooperstown
  • Ann Krauss, Consumer Advocate
  • Lawrence Levy, MD, Westchester County Medical Center
  • Patricia McDonnell, OMH Quality Assurance
  • JoAnn Piazzi, Westchester Independent Living Center
  • Frank Rabert, Crisis Service, Oneonta
  • Vera Hassner Sharav, Consumer Advocate
  • Bruce Schwartz, MD, Montefiore Medical Center, NY
  • Don Thoms, St. Vincent's Medical Center, Staten Island

Advisory Committee

  • Lorna Avila, St. Vincent's of North Richmond
  • Yogesh Backhai, Erie County Medical Center
  • Andrea Blanch, Community Support Program, OMH
  • LuRay Brown, NYC Health & Hospital Corp.
  • Neal Cohen, Mount Sinai Medical Center
  • Gladys Egri, Erie County Medical Center
  • Susan Erway, Ambulatory Care CSI, OMH
  • Sandra Forquer, Quality Assurance, OMH
  • Frederick Heigel, NYS Health Department
  • Barbara Heyne, Counsel, OMH
  • Celeste Johns, Imogene Bassett Hospital
  • Lawrence Levy, Westchester County Medical Center
  • Elizabeth Lobo, Greater Hospital Assoc. of NY
  • Patricia McDonnell, Data/Incident Management, OMH
  • Luis Marcos, NYC Mental Health
  • Russell Massaro, Clinical Support, OMH
  • Susan Moore, HANYS
  • Lucille Okoshkin, Alliance of the Mentally Ill
  • John Oldham, Commissioner's Office, OMH
  • JoAnn Piazzi, Westchester Independent Living Center
  • Leonard Reich, Health Insurance Plan of Greater NY
  • Steven Scher, NYC Regional Office, OMH
  • Ellen Stevenson, Columbia Presbyterian Hospital
  • Fran Teeter, NYS Dept. of Social Services
  • Don Thoms, St. Vincent of North Richmond
  • Manuel Trujillo, Bellevue Hospital Center
  • William Tucker, Psychiatric Services, OMH
  • Carol VonKlober, NYS Dept. of Social Services