Development of a Measure to Assess the Implementation of Children's Systems of Care: The Systems of Care Implementation Survey (SOCIS)

Roger A. Boothroyd, PhD
Paul E. Greenbaum, PhD
Wei Wang, PhD
Krista Kutash, PhD
Robert M. Friedman, PhD

Abstract

The children's system of care framework has been extensively implemented in the U.S. Since its inception in 1993, the Comprehensive Community Mental Health Services for Children and Their Families Program has invested in excess of $1 billion supporting the development of systems of care in 164 grantee sites across the country. Despite these efforts to implement children's systems of care nationally, little is known about the extent to which the principles and values actually have been put into practice outside of the funded grantee sites. This paper describes the development of the Systems of Care Implementation Survey, a measure designed specifically for the first-ever study assessing the level of implementation of factors contributing to effective children's systems of care in a nationally representative sample of counties throughout the U.S.

Address correspondence to Roger A. Boothroyd, PhD, Department of Mental Health Law and Policy, University of South Florida, 13301 Bruce B. Down Blvd, Tampa, FL 33612-3708, USA. E-mail: [email protected].

Paul E. Greenbaum, PhD; Wei Wang, PhD; Krista Kutash, PhD; and Robert M. Friedman, PhD, Department of Epidemiology and Biostatistics, University of South Florida, Tampa, FL, USA.

Journal of Behavioral Health Services & Research, 2011. © 2011 National Council for Community Behavioral Healthcare.

Introduction

Children's systems of care

For nearly a half century, advocates have called attention to the need to reform the children's mental health system, and federal reports have documented the need for system change.1,2


In the past decade, there have been renewed calls for reform of children's mental health services, including a conference sponsored by the Surgeon General that resulted in a "national action agenda" and a presidential commission that examined mental health services for both children and adults.3,4 There were many similarities among the concerns identified in these reports, which noted that: (1) many children in need of care were not getting mental health services, (2) children were often served in excessively restrictive settings, (3) service arrays were limited and community-based options were generally lacking, (4) the entities responsible for serving children with mental health needs seldom worked together, (5) families often were blamed and were not included as partners in their children's care,5 (6) child-serving agencies rarely considered cultural differences in the provision of care,6,7 and (7) inadequate resources persisted, resulting in insufficient service capacity to meet the needs of these children.8

The origins of the children's systems of care movement lie in the Child and Adolescent Service System Program, initiated by the National Institute of Mental Health in 1983, which provided communities with financial aid and technical assistance to better integrate and coordinate child-serving agencies. The movement was solidified when Stroul and Friedman9 described the need for a comprehensive, community-based system of services and supports, which became known as "systems of care." They defined systems of care as "a comprehensive spectrum of mental health and other necessary services which are organized into a coordinated network to meet the multiple and changing needs of children and their families" (p. 3). Significant progress in implementing systems of care nationally continued with the creation of the Comprehensive Community Mental Health Services for Children and Their Families Program, which provided funding to communities to develop systems of care across the nation.10 First authorized in fiscal year 1993, the program has funded 164 grantees nationally.11

The system of care construct is dynamic and has continued to evolve over the years. Stroul and Friedman9 initially described systems of care as a comprehensive spectrum of mental health and other necessary services organized into a coordinated network to meet the multiple and changing needs of children and their families. Stroul12 clarified the concept by emphasizing that, first and foremost, systems of care are a range of treatment services and supports guided by a philosophy and supported by an infrastructure. Friedman and Hernandez13 noted that developing systems of care is neither a specific nor a simple intervention; rather, it can be seen as a general statement of policy indicating a desire to establish a complex system targeted at a specific population of children and families and based on a widely agreed upon set of principles and values. Hernandez and Hodges14 added that systems of care are better considered a cluster of organizational change strategies based on a set of values and principles intended to shape policies, regulations, funding mechanisms, services, and supports. These varying definitions and interpretations emphasize the complexity of the construct, as well as an emerging and developmental understanding of the meaning of systems of care.

Systems of care are based upon a set of core values holding that services should be community based, child-centered and family-focused, and culturally competent. Guiding principles for systems of care also specify that services should be comprehensive; individualized to each child and family; provided in the least restrictive, clinically appropriate setting; coordinated at both the system and service delivery levels; involve families and youth as full partners; and emphasize early identification and intervention.9,15 The systems of care construct recognizes that children and families have needs in many domains and promotes a holistic approach in which all life domains and needs are considered when serving children and their families, rather than addressing mental health treatment needs in isolation. Accordingly, the systems of care framework is organized around eight overlapping dimensions (e.g., family involvement, culturally competent services, and service coordination and integration), each representing an area of need for the child and family.9 Despite the numerous efforts to implement children's systems of care nationally, little is known about the extent to which the principles and values actually have been put into practice outside of the funded grantee sites.


To date, no studies could be identified that have examined the implementation of children's systems of care principles nationally. The national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program has included a component designed to assess the extent to which grantees have implemented initiatives in accordance with systems of care program theory.16 A major goal of this aspect of the evaluation was the development of a qualitative "systemness" measure to assess whether services were delivered in an individualized, family-focused, coordinated, and culturally competent manner, and whether the system involved multiple child-serving agencies. The 2001 Annual Report to Congress indicated that these system-of-care assessments of grantee sites funded between 1997 and 2000 consistently manifested systems of care principles and that communities generally scored highest on the factors related to family-focused, accessible, and individualized care. Grantee communities typically scored lowest on the factors of cultural competence and interagency collaboration.17 Vinson et al.18 presented the results of an expanded analysis using data from this measure collected from grantee sites funded during the first 5 years of this program. In short, they concluded that no local service system was able to fully implement all 16 factors conceptualized in the systems of care program theory, and noted that this was true even of sites that had prior system-building experience and/or access to significant resources.

Paulson, Fixsen, and Friedman19 examined the level of implementation of systems of care principles in 14 grantee sites and attempted to describe what factors facilitated or inhibited the implementation of these principles. They concluded that grantee communities made good progress in implementing service delivery processes consistent with program theory but that these processes did not readily produce changes at the systems level. They further stated that the most successful communities implemented strategies and structures tailored to local community needs.

This paper describes the development of the Systems of Care Implementation Survey (SOCIS), a measure designed specifically for the first-ever study assessing the level of implementation of children's systems of care in a nationally representative sample of counties throughout the United States. The study was important for several reasons. First, the survey data provided a baseline of the level of implementation of systems of care nationally. Second, this baseline offers a benchmark against which future changes in implementation status can be assessed, much as surveillance data do for any public health field. It also offers practical information on the aspects of systems of care implementation that appear to require the most work; unless such information is available, efforts to improve systems of care implementation and to assess progress will be left without adequate data either for evaluative purposes or for guiding the effort. Finally, the study created a foundation for additional research examining the relationship between the level of implementation of the factors contributing to the effectiveness of children's systems of care and child and family outcomes.
Study purpose

In this paper, the design, development, and pilot testing of a data collection strategy for assessing the level of implementation are described. The goal of this effort was to establish a strategy that would permit the collection of data on the level of implementation of children's systems of care principles from the perspectives of multiple stakeholders in a nationally representative sample of counties.

Methods

Development of the SOCIS

The first task undertaken in this study focused on determining how information would be collected on the factors contributing to effective children's systems of care.


For example, the possibility of conducting site visits and interviewing key informants was considered. Although this approach would have provided breadth and depth related to the implementation of these factors, the time and cost constraints associated with this option made it unfeasible. Given that the study design was to include the perspectives of multiple stakeholders from a nationally representative sample of 225 communities throughout the U.S., a data collection strategy was needed that would be not only reliable and valid but also practical and cost efficient, and that could be readily completed by various types of informants. After consideration of various approaches, the decision was made to construct an objective-based questionnaire that could be either self-administered or conducted as an interview.

Once this decision was made, the development of the SOCIS followed a systematic, multiphase process similar to the method recommended by Clark and Watson.20 The initial phase focused on conceptualization and review of the literature to develop the theoretical framework that would guide decisions regarding which factors associated with the implementation of systems of care would be assessed with the SOCIS. This phase was followed by the generation of an initial pool of items representing each factor to be included on a draft version of the SOCIS. The third phase involved a pilot test of the SOCIS to assess both its initial psychometric properties and the proposed data collection strategies for identifying potential informants and obtaining completed responses. During phase four, results from the psychometric analysis of the SOCIS were used to revise the measure and determine how best to collect data nationally. Finally, the SOCIS was administered to various stakeholders from a nationally representative sample of counties throughout the USA. Each of these steps is described in more detail below.

Conceptualization and review of the literature: factors impacting implementation of children's systems of care

The Research and Training Center for Children's Mental Health at the University of South Florida developed a conceptual model of 14 factors believed to be associated with the effective implementation of children's systems of care.21 These factors were determined through a systematic, multistep process. First, a comprehensive review of the research and theory on systems of care for children with serious emotional disturbances and their families was conducted.15,18,22–29 Second, a comprehensive review of research and theory was conducted in related fields, such as comprehensive community initiatives,30,31 prevention,32–34 substance abuse,35,36 and program and organizational effectiveness.37–39 Third, factors were derived from the findings of a national survey of state children's mental health directors and from a concept-mapping exercise conducted with a panel of experts in systems of care to aid in generating Center activities. Fourth, the determination of factors was partially based on the extensive experience of Center faculty and staff, who had been conducting research within systems of care for upwards of 25 years and providing consultation and technical assistance to leaders of systems. Finally, feedback on the factor model from the Center's Board of Advisors, parents, and professional leaders in children's mental health was incorporated.
It is important to note that many of these 14 factors closely parallel the original core values and principles emphasized by Stroul and Friedman9 in their seminal work. However, it is also important to recognize that there is not a one-to-one correspondence between these 14 factors and the original values and principles underlying children's systems of care. There are two reasons for this. First, as previously noted, the construct of systems of care has continued to evolve since Stroul and Friedman's initial work.9 Second, there has been a recognition that, in addition to these core values and principles, communities need an infrastructure (e.g., a skilled provider network) to successfully implement the values and principles. Therefore, some of these 14 factors reflect the community infrastructure elements that have been determined to be necessary to implement effective systems of care.


This process resulted in a theoretical framework that included the 14 implementation factors believed to contribute to effective systems of care. The 14 factors, their final definitions, and references supporting their inclusion in the SOCIS are summarized in Table 1.

Table 1
System of care implementation factors

1. Family choice and voice: Family and youth perspectives are actively sought and given high priority during all planning, implementation, and evaluation of the service system.10,47–50

2. Individualized, comprehensive, and culturally competent treatment: A range of services is available to support the development of individualized, culturally competent, and comprehensive treatment plans that assist the child and the entire family. Individualized treatment means that the services provided are based on the specific needs and strengths of individual children and their families.47–50 Comprehensive treatment addresses functioning across the full array of life domains. Culturally competent treatment addresses the specific cultural/racial/language characteristics of the family, community, and service providers that impact treatment plan effectiveness.7,10,47,48

3. Outreach and access to care: Outreach and service access are procedures (e.g., home visits, mental health workers in the schools) that facilitate obtaining care for all individuals in the identified population of concern.7,51,52

4. Transformational leadership: Transformational leaders are individuals who articulate a long-term vision that inspires others, challenge assumptions and take risks, and listen to the concerns and needs of others.28,53–56

5. Theory of change: A theory of change is the expressed beliefs and assumptions for how to serve child and adolescent populations and reach identified goals.57–61

6. Implementation plan: An implementation plan identifies procedures and strategies to achieve goals and objectives at program and system levels and includes projected timelines and expected outcomes.62–64

7. Local population of concern: The intended beneficiaries of the service system (i.e., the local population of concern) should be clearly described. Specific information should include the number of children and adolescents who are eligible for services, their ages, diagnostic profiles, demographics including cultural/racial/language diversity, location in the county, service histories, and any special needs of groups in the population.65–67

8. Interagency and cross-sector collaboration: A formal process concerned with facilitating collaboration among the various child-serving sectors (e.g., mental health, education, child welfare, juvenile justice). This process usually includes an interagency committee, which has designated participants who represent the various agencies and hold regularly scheduled meetings.9,15,68–71

9. Values and principles: An explicit statement of core values and principles that guide system development and evaluation, adopted through an inclusive, participatory process. For example, core values may include: child-centered and family-driven (the needs of the child and family dictate the services provided); community-based services (management and decision-making responsibility reside at the community level); and culturally competent (agencies, programs, and services are responsive to cultural, racial, and language diversity).9,72–74

10. Comprehensive financing plan: A comprehensive financing plan is consistent with the goals of the system, identifies expenditures across major child-serving sectors, utilizes varied sources of funding, promotes fiscal flexibility, maximizes federal entitlements, and redirects spending from restrictive placements to home- and community-based services.10,47,75

11. Skilled provider network: A skilled provider network represents an assessment of the group of service providers that populate a particular system. Providers should be diverse in background, culturally competent, effective in providing services, behave consistently with the values and principles promoted by the system, and have sufficient capacity to provide family choice.76–80

12. Performance measurement system: The ongoing monitoring of program/system accomplishments, particularly progress toward preestablished goals. Performance measurement systems involve regularly collected data on the level and type of program/system activities (process), the direct products and services delivered by the programs (outputs), and the results of these activities (outcomes).81–83

13. Provider accountability: Funding for providers is tied to their performance so that incentives are created for high-quality and family-responsive outcomes.84

14. Management and governance: Management and governance refers to the decision-making individuals and groups that are responsible for maintaining the system's values, principles, goals, and strategies. They use data and stakeholder input to manage and continuously strengthen and improve the system.19,37,56

SOCIS item generation

Following the identification and construction of the 14 implementation factors and their definitions, teams with expertise in each of these 14 areas, including parents, revised the initial factor definitions and then generated a number of items they believed were important to comprehensively assess each factor. This process resulted in the generation of 106 potential items for inclusion in the initial version of the SOCIS. The project team edited the proposed items for redundancy and structured them in a common response format for ease of administration. Most of the items asked informants to rate the level of implementation on a scale of 1–5, with 5 indicating greater implementation, while the remaining items asked informants to respond yes or no. Items were then formatted in the initial draft of the SOCIS.

Once the draft of the SOCIS was completed, it was reviewed by a national panel of 16 individuals, again including parents and child services professionals. Unlike the initial experts who constructed the items contained on the SOCIS, this panel evaluated the factor definitions and survey items on three dimensions: clarity, construct fit (i.e., centrality), and importance. In addition, panel members were asked to provide qualitative comments on the items and factor definitions.

SOCIS survey pilot testing and data collection procedures

Between fall 2005 and spring 2006, a pilot test was conducted to assess the (1) adequacy of the draft SOCIS protocol, (2) feasibility of the proposed data collection procedures, and (3) amount of time required to obtain completed survey responses. Seven counties in the USA were randomly selected based on population size: (a) Kings, NY, (b) Lake, IL, (c) Arapahoe, CO, (d) Yuma, AZ, (e) Floyd, IN, (f) Fayette, IN, and (g) Conway, AR. To recruit informants for pilot testing from these seven counties, e-mails were sent to the state directors of children's mental health services informing them about the study, specifying the county(ies) selected in their state, and asking for a contact in each county to help identify informants. Staff from the Louis de la Parte Florida Mental Health Institute (FMHI) then called county contacts to seek assistance in identifying potential informants. When connections were made, county contacts were helpful in identifying potential SOCIS informants; however, connecting with these contacts proved time-consuming. On average, five calls over a 7-day period were necessary to obtain a completed survey. To decrease this time, the use of web-based searches was pilot tested. These efforts proved useful in identifying informants from the mental health and education systems and were incorporated into the data collection procedures.

The pilot testing of the SOCIS involved two procedures. First, 38 usable surveys were obtained from informants representing the four child-serving systems (i.e., mental health, school, family organizations, and other child-serving systems) who completed the draft version of the SOCIS. Multiple strategies were available and piloted for obtaining SOCIS responses, including telephone interviews, e-mail attachments, faxed copies, and mailed hard copies with a stamped return envelope.
Second, 13 different informants (three family organization representatives and 10 mental health administrators and clinicians) participated in cognitive interviewing, during which they discussed their reactions to and understanding of each item. Cognitive interviewing is a technique that attempts to examine the cognitive processes a participant uses when answering survey items.40 The process differs from more traditional survey pretesting methods, which typically ask participants to simply answer the survey items, by asking them to provide specific feedback regarding the wording of the survey items.




Although there are different forms of cognitive interviewing (e.g., think-aloud and verbal probing; see Jobe and Mingay41), field testing of the SOCIS was conducted using a verbal-probing approach designed to assess informants' (a) comprehension (e.g., "What does the phrase 'your leadership' mean to you?"), (b) memory retrieval (e.g., "How did you arrive at your answer to this item?"), and (c) decision processes (e.g., "How hard was this item to answer?"). Prior to the start of any data collection, all study procedures, scripts, and protocols were reviewed and approved by the University of South Florida's Institutional Review Board to ensure that all study procedures protected and respected the rights of study participants.

Results

The results of the pilot testing of the SOCIS are organized into four sections: (1) the expert panel's assessment of the initial version of the SOCIS, (2) findings associated with the psychometric analysis of the SOCIS, (3) the results derived from the cognitive interviews, and (4) lessons learned regarding our data collection strategies.

Panel members' assessment of the SOCIS

Panel members' ratings of item clarity ranged from 3.14 to 4.87 on a 5-point scale, with 5 representing greater clarity. In terms of item centrality, panel members' ratings ranged from 3.67 to 4.87, with 5 representing greater centrality. Ratings of item importance ranged from 3.69 to 4.86. All items that had a mean rating of less than 4.00 on any dimension were examined extensively. Based on the expert panel members' ratings and qualitative comments, factor definitions and survey items were modified, resulting in a revised version of the SOCIS.
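To make this screening rule concrete, the short sketch below flags any item whose mean panel rating fell below 4.00 on any of the three dimensions. It is illustrative only: the item names and ratings are hypothetical stand-ins, not the actual panel data.

    import pandas as pd

    # Hypothetical mean panel ratings (1-5 scale) for three items on the three
    # dimensions used to evaluate the SOCIS items: clarity, centrality, importance.
    ratings = pd.DataFrame(
        {
            "clarity": [4.87, 3.80, 4.40],
            "centrality": [4.70, 4.10, 3.90],
            "importance": [4.80, 4.05, 4.20],
        },
        index=["item_01", "item_02", "item_03"],
    )

    # Flag items with a mean rating below 4.00 on any dimension for extensive review.
    flagged = ratings[(ratings < 4.00).any(axis=1)]
    print(flagged)  # item_02 (clarity 3.80) and item_03 (centrality 3.90)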

Psychometric analysis of the SOCIS pilot test

A psychometric analysis was performed on the data from the pilot test, focused primarily on answering the question of how well the items comprising each factor actually "hung together" as hypothesized. This was assessed by calculating coefficient alpha for each factor. Results of this analysis, based on responses from 38 informants, indicated that the vast majority of the 106 items and 14 factors performed well during the pilot testing of the SOCIS. Twelve of the 14 factors had internal consistency reliability estimates exceeding 0.70, and values for all factors ranged from 0.605 to 0.876 (see Table 2 for the coefficient alpha and number of items for each factor). According to Nunnally and Bernstein,42 reliability coefficients of 0.70 or higher are acceptable for making most decisions. The alpha for the overall SOCIS was acceptable, averaging 0.769 across all 14 factors.

A second set of analyses was performed to determine informants' ability to answer the survey items. Interviewers noted that when informants indicated that no formal system of care existed in their county, they experienced considerable difficulty responding to items that involve systems of care terminology. Results of this analysis revealed that 20 of the SOCIS survey items had more than 20% of informants answering "don't know," while eight items had more than 33% of informants answering "don't know." Items with a high proportion of "don't know" responses tended to be in four factors: local population of concern, values and principles, comprehensive financing plan, and performance measurement system. Two factors had a high proportion (>50%) of informants answering "don't know" to at least one item in the factor: comprehensive financing plan (57%) and performance measurement system (62%). Given the difficulty experienced by some informants in answering some of the survey items, an item was added that directly asked informants whether there was a formal system of care in their county; if they responded no, the term was redefined (i.e., informants were asked to rate the organization that provides services to children with mental health needs in their county).
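For readers who wish to reproduce this type of analysis, a minimal sketch of the coefficient alpha computation is shown below. It is illustrative only: the data are randomly generated stand-ins for SOCIS responses, and the function is a generic implementation of Cronbach's alpha rather than project code.

    import numpy as np

    def cronbach_alpha(items):
        """Coefficient alpha for an (informants x items) matrix of ratings:
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical example: 38 informants rating a 5-item factor on a 1-5 scale.
    rng = np.random.default_rng(0)
    ratings = rng.integers(1, 6, size=(38, 5))
    print(round(cronbach_alpha(ratings), 3))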


Table 2
Pilot interview: Cronbach alphas for the implementation factors (N=38)

                                                     Original          Revised
Factor                                            Items      α     Items      α
Family choice and voice                               5   0.769        5   0.769
Individualized, comprehensive, and
  culturally competent(a)                            10   0.898        6   0.866
Outreach/access to care                               5   0.606        3   0.777
Transformational leadership                          11   0.907        5   0.837
Theory of change                                      6   0.836        5   0.840
Implementation plan                                   7   0.784        5   0.727
Local population of concern                           4   0.346        3   0.620
Interagency and cross-sector collaboration(a)        12   0.895        6   0.876
Values and principles                                 7   0.112        5   0.605
Comprehensive financing plan                          8   0.879        6   0.862
Skilled provider network                              5   0.654        4   0.692
Performance measurement system                        8   0.758        5   0.778
Provider accountability                               4   0.700        4   0.700
Management and governance                             4   0.819        4   0.819
Total                                               106   0.817       66   0.769

(a) Two additional items were added to each of these two factors after the pilot testing to fill in perceived gaps, resulting in a SOCIS that contained 70 items.

Cognitive interviews

Thirteen cognitive interviews were completed with informants representing child-serving professionals and parents. The general consensus of these informants was that the majority of survey items were clear, understandable, answerable, and not too difficult. Interviewees' feedback indicated that the wording of some terms used in the survey was too vague or awkward. In these cases, the terms were either rephrased or replaced. A number of family organization informants commented that they experienced some difficulty answering items on several specific factors (e.g., performance measurement system, management and governance, and comprehensive financing plan). Based on this feedback, as well as the analysis of informants' ability to answer the survey items (see above), skip sections were included so that informants who felt they did not have sufficient information to answer could skip the items in that section. In addition, some informants thought they were going to be asked about their own children, rather than the service system more broadly, so further clarification was provided on this issue in the survey instructions.

Lessons learned

A number of lessons were learned during the pilot testing of the SOCIS that resulted in concrete changes and revisions to the survey and the data collection processes. One issue that emerged was related to identifying appropriate informants to complete the SOCIS. This challenge was due in part to the different organizational structures associated with various child-serving systems.


For example, in smaller counties, providers, school districts, and family organizations were more likely to be regionally based, as opposed to county based. Finding the appropriate regional entities was challenging. Additionally, informants expressed difficulty restricting their responses to a specific county in contrast to the region served. In larger counties, multiple child-serving systems (e.g., multiple school districts within a county) existed, creating challenges in identifying appropriate informants. In this situation, informants expressed difficulty broadening their perspectives to the entire county as opposed to the area served.

Additionally, some informants suggested that some sections of the survey were too discontinuous and "jumped around." As a result, these sections of the SOCIS were reordered to improve flow.

Informants' feedback also suggested the survey was too long. Therefore, the number of implementation factor items was decreased from 96 to 66 (excluding the 10 demographic questions), a reduction of over 25%. Of the 30 items dropped from the draft version of the SOCIS, 13 were eliminated because the Cronbach alphas of the scales to which they belonged improved as a result of their deletion. For example, for Outreach and Access to Care, the item "Is there a mechanism to assess mental health needs" was deleted because the alpha for the scale increased from 0.606 to 0.646 when the item was deleted (the corrected item-total correlation for the dropped item was 0.054). Following the subsequent addition of items to fill perceived gaps in the existing factor content (two each for the Interagency Collaboration factor and the Individualized, Comprehensive, and Culturally Competent Treatment factor), 17 more items were deleted to reduce the total number of items to 70 and maintain approximately five items per factor.

Additionally, a number of informants reported difficulty finding the time to complete the interview over the phone. Interviewers' experience indicated that completed surveys required an average of nine telephone calls and in excess of 5 hours of interviewer time. As a result, another option was created to facilitate study participation and reduce staff time: the SOCIS was made available as a web-based survey (i.e., Survey Monkey®) so that potential informants could complete the survey at their leisure and automated reminders could be sent to prospective informants.
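A sketch of the item-level diagnostics described above (the alpha that results when each item is deleted, and the corrected item-total correlation) is shown below. It reuses the cronbach_alpha function from the earlier sketch and again assumes hypothetical data rather than the actual SOCIS responses.

    import numpy as np

    def item_diagnostics(items):
        """Print, for each item: coefficient alpha with that item deleted, and the
        corrected item-total correlation (item vs. sum of the remaining items)."""
        items = np.asarray(items, dtype=float)
        for j in range(items.shape[1]):
            rest = np.delete(items, j, axis=1)   # drop item j
            alpha_wo = cronbach_alpha(rest)      # defined in the earlier sketch
            r_it = np.corrcoef(items[:, j], rest.sum(axis=1))[0, 1]
            print(f"item {j}: alpha if deleted = {alpha_wo:.3f}, "
                  f"corrected item-total r = {r_it:.3f}")

An item like the outreach example above, with a corrected item-total correlation near zero and an alpha-if-deleted above the scale's overall alpha, is a natural candidate for removal.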

Discussion

Summary

This paper describes the development and pilot testing of the Systems of Care Implementation Survey, which was designed specifically for the first-ever study assessing the level of implementation of children's systems of care in a nationally representative sample of counties throughout the U.S. In addition, the pilot test examined the proposed procedures for obtaining survey responses from this nationally representative sample. The findings from the pilot test provided sufficient psychometric evidence to support its use in a national study involving a representative sample of counties. The internal consistency reliabilities of 12 of the 14 factors were more than acceptable (i.e., ≥0.70), and a factor analysis conducted on the pilot data strongly supported the hypothesized factor structure. See the related paper in this issue43 for a detailed description of the sampling procedures used to obtain a nationally representative sample of counties, as well as the final psychometric properties of the SOCIS and factor analytic results on the national sample.

As previously noted, a number of revisions were made to both the SOCIS and the data collection procedures based on the results of this pilot test. Principal among the revisions to the SOCIS were clarification of some of the implementation factor definitions, shortening of the survey, and permitting informants to omit sections that they did not feel qualified to answer. Most notable among the changes to the data collection procedures for the national study was the development of a web-based version of the SOCIS using the Survey Monkey® interface.

Although no formal attempt was made during this pilot study to empirically assess the construct validity of the SOCIS, data from the national sample of respondents could support such efforts.


For example, the ability of the SOCIS to discriminate between funded systems of care grantee sites and communities that have not received funding would provide support for the Comprehensive Community Mental Health Services for Children and Their Families Program as well as evidence of the SOCIS's construct validity. In addition, construct validity could be assessed by determining whether communities with higher levels of implementation of systems of care principles, as assessed by the SOCIS, also have more favorable children's social indicators relative to communities with lower levels of implementation (i.e., convergent validity).
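A minimal sketch of the first of these proposed checks, a known-groups comparison of funded versus non-funded counties, is shown below. The scores and group sizes are hypothetical; the study itself proposed but did not conduct this analysis.

    import numpy as np
    from scipy.stats import ttest_ind

    # Hypothetical county-level SOCIS implementation scores (higher = more implemented).
    funded = np.array([3.9, 4.1, 3.7, 4.3, 3.8])    # funded grantee counties
    unfunded = np.array([3.1, 3.4, 2.9, 3.6, 3.2])  # counties without grants

    # If funded counties score reliably higher, that is one piece of evidence
    # for the construct validity of the SOCIS (known-groups validation).
    t, p = ttest_ind(funded, unfunded)
    print(f"t = {t:.2f}, p = {p:.4f}")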

Limitations The inclusion of multiple informants presented a challenge in the development of the SOCIS. Despite widespread agreement among researchers regarding the importance of obtaining information about certain attributes from multiple informants44 (e.g., mental health, education, parents), convergence of data from across informants is difficult to achieve. This is due to a variety of reasons, such as differences in informants’ level of knowledge regarding the attribute being assessed and how their assessment of the attribute is colored by their own personal experiences. In the current study, inclusion of multiple perspectives increased within county variance, which resulted in decreased reliability of county estimates of the level of implementation of the 14 factors. Despite this fact, it is likely the inclusion of a larger number of informants within each county would minimize this issue. This, however, would require additional resources that were not accessible for the current study. A second challenge was related to the differing geographical structures and boundaries associated with the various child-serving systems whose perspectives we sought in this study. Although counties were chosen as the unit of measure because a systematic sampling frame could be developed and implemented, the delivery of mental health, child welfare, educational, and other children’s services often do not comport with county boundaries nationally. Moreover, within the same community, the geographical jurisdictions often differ across child-serving systems. This created challenges for informants in restricting or broadening their assessment. For some informants, our request for a county-level assessment required them to assess a limited portion of the geographical area for which they had jurisdiction (e.g., in very rural areas), while for other informants, the county boundaries required them to provide an assessment that was much broader in scope than the area for which they had actual responsibility (e.g., Los Angeles county). This almost certainly served to additionally increase the within-county variability associated with informants’ level of implementation of the factors contributing to effective systems of care. The complexity of and variability among the implementation factors assessed also presented challenges for informants. Despite providing informants with specific definitions of each implementation factor, the task of rating their community’s level of implementation on access to Individualized, Comprehensive and Culturally Competent Treatment and the community have a Comprehensive Financing Plan is complex and required informants to be knowledgeable about a broad range of issues in their communities. Finally, the sample size associated with the pilot testing of the SOCIS was limited (i.e., 38 respondents). Nevertheless, findings from this pilot test informed revisions to the instrument prior to its use in the national evaluation. It is important to note that this paper only summarizes the development and pilot work associated with the SOCIS. Promising estimates of psychometric properties using this small sample suggested that use of the instrument with a large, nationally representative sample that includes a sufficient number of informants in each county would not only improve the psychometric properties of the measure, but would also provide robust estimates regarding the level of implementation of factors associated with successful systems of care.
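One way to formalize the first limitation above, under classical test theory assumptions (an illustration, not an analysis reported in the study), is the Spearman-Brown prophecy formula for the reliability r_k of a county mean based on k informants, given the reliability r_1 of a single informant's rating:

    % Reliability of a county-level mean aggregated over k parallel informants
    \[
      r_k = \frac{k \, r_1}{1 + (k - 1)\, r_1}
    \]
    % Example: if r_1 = 0.30, then k = 5 informants yield
    % r_5 = (5)(0.30) / (1 + (4)(0.30)) \approx 0.68

The formula makes the trade-off explicit: low single-informant agreement can be offset by sampling more informants per county, which is exactly the additional resource the study could not obtain.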


Implications for Behavioral Health

It was noted many years ago that a vexing problem facing program sponsors and evaluators is how to establish and interpret performance monitoring systems.45 Despite the federal government's $1.5 billion investment since 1993 to develop systems of care in 164 sites nationally,46 little is known about the extent to which the principles and values of children's systems of care have been implemented in communities outside these funded grantee sites. As previously noted, the national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program assesses the extent to which grantees have implemented principles consistent with systems of care program theory.16 Findings summarized in a report to Congress17 and in two published articles18,19 concluded that grantee communities made good progress implementing these principles but that the level of implementation varied among principles and across sites. To date, assessment of the implementation of these principles in a nationally representative sample has not been possible because an adequate measure and associated data collection strategy did not exist. In fact, Vinson et al.18 noted that their "…exploratory efforts to develop a tool to assess implementation quantitatively failed to meet reliability thresholds…" (p. 30).

This study developed and pilot-tested a quantitative measure and established a data collection strategy specifically for this purpose: to assess the level of implementation of children's systems of care principles from the perspectives of multiple stakeholders nationally. The study was important because the findings reported in the remaining three papers in this special section81,85 provide sponsors and evaluators with a national estimate of the current level of implementation of systems of care principles against which results from future studies can be compared and benchmarks developed. Though the use of benchmarks and comparative standards is not without critics, who argue that they foster competition and reduce cooperation,45 when used appropriately, benchmarks permit us to understand where we are, which is a critical step in getting where we want to go.

Acknowledgments

This research was partially supported by Grant H133B90004 from the Center for Mental Health Services, Substance Abuse and Mental Health Services Administration, and the National Institute for Disability and Rehabilitation Research, and by Grant R01 MH70680-01A from the National Institute of Mental Health (NIMH). The opinions contained in this manuscript are those of the authors and do not necessarily reflect those of the US Department of Education, the Center for Mental Health Services, SAMHSA, or NIMH.

References

1. President's Commission on Mental Health. Report of the Sub-task Panel on Infants, Children and Adolescents. Washington, D.C.; 1978.
2. US Congress. Children's Mental Health: Problems and Services—a Background Paper. Washington, D.C.: Office of Technology Assessment; 1986.
3. President's New Freedom Commission on Mental Health. Achieving the promise: transforming mental health care in America. Rockville, MD: Department of Health and Human Services; 2003.
4. US Public Health Service. Report of the Surgeon General's Conference on Children's Mental Health: A National Action Agenda. Washington, D.C.: Department of Health and Human Services; 2000.
5. Friesen BJ, Huff B. Family perspectives on systems of care. In: Stroul BA, ed. Children's Mental Health: Creating Systems of Care in a Changing Society. Baltimore, MD: Paul H. Brookes Publishing Co., Inc.; 1996:41–68.
6. Isaacs-Shockley M, Cross T, Bazron B, et al. Framework for a culturally competent system of care. In: Stroul BA, ed. Children's Mental Health: Creating Systems of Care in a Changing Society. Baltimore, MD: Paul H. Brookes Publishing Co., Inc.; 1996:23–40.
7. US Department of Health and Human Services, Office of Minority Health. Assuring Cultural Competence in Health Care: Recommendations for National Standards and an Outcomes-focused Research Agenda. Rockville, MD: IQ Solutions, Inc.; 2001. Available at: http://www.omhrc.gov/clas/index.htm. Accessed January 4, 2005.


8. Stroul BA, Pires SA, Armstrong MI. Health Care Reform Tracking Project: Tracking State Health Care Reforms as They Affect Children and Adolescents with Behavioral Health Disorders and Their Families—2000 State Survey. Tampa, FL: Research and Training Center for Children's Mental Health, Department of Child and Family Studies, Division of State and Local Support, Louis de la Parte Florida Mental Health Institute, University of South Florida; 2001.
9. Stroul BA, Friedman RM. A System of Care for Severely Emotionally Disturbed Children and Youth. Washington, D.C.: Georgetown University, CASSP Technical Assistance Center; 1986.
10. Center for Mental Health Services (CMHS). Annual Report to Congress on the Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program, Executive Summary, 2004. Atlanta, GA: ORC Macro; 2004.
11. Blau GM, Huang LN, Mallery CJ. Advancing efforts to improve children's mental health in America: a commentary. Administration and Policy in Mental Health and Mental Health Services Research. 2010; 37:140–144.
12. Stroul BA. Issue Brief. Systems of Care: A Framework for System Reform in Children's Mental Health. Washington, D.C.: National Technical Assistance Center for Children's Mental Health, Georgetown University Center for Child and Human Development; 2002.
13. Friedman RM, Hernandez M. The national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program: a commentary. Children's Services: Social Policy, Research, and Practice. 2002; 5(1):67–74.
14. Hernandez M, Hodges S. Building upon the theory of change for systems of care. Journal of Emotional and Behavioral Disorders. 2003; 11:19–26.
15. Stroul BA, Friedman R. The system of care concept and philosophy. In: Stroul BA, ed. Children's Mental Health: Creating Systems of Care in a Changing Society. Baltimore, MD: Paul H. Brookes Publishing Co., Inc.; 1996:591–612.
16. Holden EW, Friedman RM, Santiago RL. Overview of the national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program. Journal of Emotional and Behavioral Disorders. 2001; 9:4–12.
17. Center for Mental Health Services. Annual Report to Congress on the Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program, 2001. Atlanta, GA: ORC Macro; 2001.
18. Vinson NB, Brannan AM, Baughman LN, et al. The system-of-care model: implementation in twenty-seven communities. Journal of Emotional and Behavioral Disorders. 2001; 9:30–42.
19. Paulson R, Fixsen D, Friedman R. An Analysis of Implementation of Systems of Care at Fourteen CMHS Grant Communities. Tampa, FL: Louis de la Parte Florida Mental Health Institute, University of South Florida; 2004.
20. Clark LA, Watson D. Constructing validity: basic issues in objective scale development. Psychological Assessment. 1995; 7:309–319.
21. Friedman RM. A model for implementing effective systems of care. In: Newman C, Liberton C, Kutash K, Friedman RM, eds. The 18th Annual Research Conference Proceedings, A System of Care for Children's Mental Health: Expanding the Research Base. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, Research and Training Center for Children's Mental Health; 2007:3–9.
22. Brannan AM, Baughman LN, Reed ED. System-of-care assessment: cross-site comparison of findings. Children's Services: Social Policy, Research, and Practice. 2002; 5:37–56.
23. Friedman RM, Fixsen D, Paulson R. Implementing effective systems of care: how can we improve? Paper presented at the 11th Annual Building on Family Strengths Conference, Portland, OR; May 2004.
24. Holden EW, De Carolis G, Huff B. Policy implications of the national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program. Children's Services: Social Policy, Research, and Practice. 2002; 5:57–66.
25. Meridian Consulting Services, Inc. Integrated Systems of Care for Children's Mental Health: A Technical Assistance Resource Book. Albany, NY: Author; 1999.
26. Pires SA. Primer Hands On: The Skill Building Curriculum for System of Care Leaders. Washington, DC: Georgetown University, National Technical Assistance Center for Children's Mental Health, Center for Child and Human Development; 2003.
27. Pumariega AJ, Winters NC, Huffine C. The evolution of systems of care for children's mental health: forty years of community child and adolescent psychiatry. Community Mental Health Journal. 2003; 39:399–425.
28. Rosenblatt A, Woodbridge M. Deconstructing research on systems of care for youth with EBD: frameworks for policy research. Journal of Emotional and Behavioral Disorders. 2003; 11:27–38.
29. Stephens RL, Holden EW, Hernandez M. System-of-care practice review scores as predictors of behavioral symptomatology and functional impairment. Journal of Child and Family Studies. 2004; 13:179–191.
30. Gray B, Duran A, Segal A, eds. Revisiting the Critical Elements of Comprehensive Community Initiatives. Washington, DC: US Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation; 1997.
31. Kubisch AC, Auspos P, Brown P, et al. Voices from the Field II: Reflections on Comprehensive Community Change. Queenstown, MD: Aspen Institute; 2002.
32. Bond LA, Hauf AMC. Taking stock and putting stock in primary prevention: characteristics of effective programs. The Journal of Primary Prevention. 2004; 24:199–221.
33. Nation M, Crusto C, Wandersman A, et al. What works in prevention: principles of effective prevention programs. American Psychologist. 2003; 58:449–456.
34. Wandersman A, Florin P. Community interventions and effective prevention. American Psychologist. 2003; 58:441–448.
35. Chinman M, Imm P, Wandersman A. Getting to Outcomes 2004: Promoting Accountability through Methods and Tools for Planning, Implementation, and Evaluation. Santa Monica, CA: RAND Corporation; 2004.
36. Wandersman A, Imm P, Chinman M, et al. Getting to outcomes: a results-based approach to accountability. Evaluation and Program Planning. 2000; 23:389–395.
37. Collins J. Good to Great: Why Some Companies Make the Leap…and Others Don't. New York: HarperCollins Publishers, Inc.; 2001.
38. Greenberg M. Research-based programs and lessons of implementation. In: Bruner C, Greenberg M, Guy C, Little M, Weiss H, Schorr L, eds. Funding What Works: Exploring the Role of Research on Effective Programs and Practices in Government Decision-making. Des Moines, IA: National Center for Service Integration Clearinghouse and the Center for Schools and Communities; 2001:7–18.
39. Weiss H. Making progress: learning and public accountability. In: Bruner C, Greenberg M, Guy C, Little M, Schorr L, Weiss H, eds. Funding What Works: Exploring the Role of Research on Effective Programs and Practices in Government Decision-making [Monograph]. Des Moines, IA: National Center for Service Integration Clearinghouse and the Center for Schools and Communities; 2001:23–30.


40. Jobe JB, Mingay DJ. Cognitive research improves questionnaires. American Journal of Public Health. 1989; 79:1053–1055.
41. Nunnally JC, Bernstein IH. Psychometric Theory. 3rd ed. New York: McGraw-Hill; 1994.
42. Greenbaum PE, Wang W, Boothroyd R, et al. Multilevel confirmatory factor analysis of the Systems of Care Implementation Survey (SOCIS). Journal of Behavioral Health Services & Research. 2011; 38(3).
43. Kraemer HC, Measelle JR, Ablow JC, et al. A new approach to integrating data from multiple informants in psychiatric assessment and research: mixing and matching contexts and perspectives. American Journal of Psychiatry. 2003; 160:1566–1577.
44. Henry GT, McTaggart MJ, McMillan JH. Establishing benchmarks for outcome indicators: a statistical approach to developing performance standards. Evaluation Review. 1992; 16:131–150.
45. ICF Macro. Children's Mental Health Initiative (CMHI): National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program; 2011. Available at: http://www.macrointernational.com/projects/cmhi/default.aspx.
46. Kutash K, Greenbaum P, Wang W, et al. Levels of system of care implementation: a national benchmarking study. Journal of Behavioral Health Services & Research. 2011; 38(3).
47. Center for Mental Health Services (CMHS). Cooperative Agreements for the Comprehensive Community Mental Health Services for Children and Their Families Program, Guidance for Applicants (GFA) (No. SM-02-002, Part I—Programmatic Guidance). Rockville, MD: Substance Abuse and Mental Health Services Administration; 2002. Also available at: http://alt.samhsa.gov/grants/content/2002/2002grants.htm.
48. Hernandez M, Gomez A. System of Care Practice Review. Tampa: University of South Florida, Department of Child and Family Studies; 2002. Available at: http://cfs.fmhi.usf.edu/tread/misc/socpr.htm. Accessed January 4, 2005.
49. ORC Macro. System-of-Care Assessment 2001: Site Visitor Training Manual. Atlanta, GA: ORC Macro; 2001.
50. Walker JS, Koroloff N, Schutte K. Implementing High-quality Collaborative Individualized Service/Support Planning: Necessary Conditions. Portland, OR: Research and Training Center on Family Support and Children's Mental Health, Portland State University; 2003.
51. Ringel JS, Sturm R. National estimates of mental health utilization and expenditures for children in 1998. Journal of Behavioral Health Services & Research. 2001; 28:319–333.
52. US Department of Health and Human Services. Mental Health: A Report of the Surgeon General. Rockville, MD: US Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services, National Institutes of Health, National Institute of Mental Health; 1999.
53. Copeland MA. Leadership of inquiry: building and sustaining capacity for school improvement. Educational Evaluation & Policy Analysis. 2003; 25:375–395.
54. Duchnowski AJ, Kutash K, Oliveira B. A systematic examination of school improvement activities that include special education. Remedial and Special Education. 2004; 25:117–129.
55. Friedman RM. New forms of leadership. In: The Case for Kids. Tampa: University of South Florida, Department of Child and Family Studies; 1996:19–20.
56. Hodges S, Hernandez M, Nesman T, et al. Creating Change and Keeping It Real: How Excellent Child-serving Organizations Carry Out Their Goals. Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, Research and Training Center for Children's Mental Health; 2002.
57. Alter C, Murty S. Logic modeling: a tool for teaching practice evaluation. Journal of Social Work Education. 1997; 33:103–117.
58. Aronson SR, Mutchler SE, Pan DT. Theories of Change: Making Programs Accountable and Making Sense of Program Accountability. Austin, TX: Southwest Educational Development Laboratory; 1998.
59. Chen HT. Theory-driven Evaluations. Newbury Park, CA: Sage; 1990.
60. Connell JP, Kubisch AC. Applying a theory of change approach to the evaluation of comprehensive community initiatives: progress, prospects, and problems. In: Fulbright-Anderson K, Kubisch A, Connell JP, eds. New Approaches to Evaluating Community Initiatives: Theory, Measurement, and Analysis. Washington, DC: The Aspen Institute; 1998:15–44.
61. Hernandez M, Hodges S. Crafting Logic Models for Systems of Care: Ideas into Action. Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, Department of Child and Family Studies; 2003.
62. Fixsen DL, Blase KA. Taking evidence-based programs to scale. Paper presented at the National Association of State Mental Health Program Directors Conference, Arlington, VA; 2004.
63. Fixsen DL, Naoom SF, Blase KB, et al. A Review and Synthesis of the Literature Related to Implementation of Programs and Practices. Tampa, FL: National Implementation Research Network, Louis de la Parte Florida Mental Health Institute, University of South Florida; 2005.
64. Rosenblatt A, Wyman N, Kingdon D, et al. Managing what you measure: creating outcome-driven systems of care for youth with serious emotional disturbances. The Journal of Behavioral Health Services & Research. 1998; 25:177–193.
65. Fixsen DL, Blase KA. Fundamentals of Program Implementation. Tampa, FL: Louis de la Parte Florida Mental Health Institute, University of South Florida; 2003.
66. Rossi P, Freeman H. Evaluation: A Systematic Approach. 5th ed. Newbury Park, CA: Sage Publications, Inc.; 1993.
67. Burns BJ, Costello EJ, Angold A, et al. Children's mental health service use across service sectors. Health Affairs. 1995; 14:147–159.
68. Hoagwood K, Johnson J. School psychology: a public health framework: I. From evidence-based practices to evidence-based policies. Journal of School Psychology. 2003; 41:3–21.
69. Kutash K, Duchnowski A. Create comprehensive and collaborative systems. Journal of Emotional and Behavioral Disorders. 1997; 5:66–75.
70. Ringeisen H, Henderson K, Hoagwood K. Context matters: schools and the "research to practice gap" in children's mental health. School Psychology Review. 2003; 32:153–168.
71. Armstrong JS. Strategic planning improves manufacturing performance. Long Range Planning. 1991; 24:127–129.


72. Furlong MJ, Woodbridge MW, Sosna T, et al. Santa Barbara County multiagency integrated system-of-care project. In: Epstein M, Kutash K, Duchnowski AJ, eds. Outcomes for Children with Emotional and Behavioral Disorders and Their Families: Program and Evaluation Best Practices. 2nd ed. Austin, TX: Pro-Ed; 2005:329–353.
73. Kamradt B, Gilbertson S, Lynn N. Wraparound Milwaukee: program description and evaluation. In: Epstein M, Kutash K, Duchnowski AJ, eds. Outcomes for Children with Emotional and Behavioral Disorders and Their Families: Program and Evaluation Best Practices. 2nd ed. Austin, TX: Pro-Ed; 2005:307–328.
74. US Department of Health and Human Services. Sustainability Tool Kit. Rockville, MD: US Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services; 2003.
75. Friedman RM. Children's mental health: a status report and call to action. Invited testimony presented at the meeting of the President's New Freedom Commission on Mental Health, Washington, D.C.; 2002.
76. Friedman RM. Expanding the framework for developing, operating, and sustaining local systems of care. Paper presented at the Training Institutes on Developing Local Systems of Care for Children and Adolescents with Emotional Disturbances and Their Families: Family Involvement and Cultural Competence, Washington, D.C.; 2002.
77. Friedman RM. Creating informed choice for families: the link between individualized care, data-based and value-based systems of care, and evidence-based practice. Paper presented at the 11th Annual Building on Family Strengths Conference, Portland, OR; 2004.
78. Hoza B. Psychosocial treatment issues in the MTA: a reply to Greene and Ablon. Journal of Clinical Child Psychology. 2001; 30:126–130.
79. Kern L, Mantegna ME, Vorndran CM, et al. Choice of task sequence to reduce problem behaviors. Journal of Positive Behavioral Interventions. 2001; 3:3–10.
80. Deming WE. My view of quality control in Japan. Reports of Statistical Application Research. 1975; 22:73–80.
81. Donabedian A. Explorations in Quality Assessment and Monitoring: The Definition of Quality and Approaches to Its Assessment. Ann Arbor, MI: Health Administration Press; 1980.
82. Hermann RC, Regner JL, Erickson P, et al. Developing a Quality Management System for Behavioral Health Care: The Cambridge Health Alliance Experience. Cambridge, MA: President and Fellows of Harvard College; 2000.
83. Behar L, Friedman R, Lynn N. Expanding the Framework for Systems of Care: A Study of Systems with Extensive Provider Networks. Tampa, FL: University of South Florida, The Louis de la Parte Florida Mental Health Institute; 2003.
84. Collins D. Pretesting survey instruments: an overview of cognitive methods. Quality of Life Research. 2003; 12:229–238.
85. Lunn LM, Heflinger CA, Wang W, et al. Community characteristics and implementation factors associated with effective systems of care. Journal of Behavioral Health Services & Research. 2011; 38(3).
