Environmental Education Research, Vol. 6, No. 1, 2000

Potential Guidelines for Conducting and Reporting Environmental Education Research: qualitative methods of inquiry

N. J. SMITH-SEBASTO, University of Illinois at Urbana-Champaign, Urbana, IL, USA

SUMMARY  This article presents guidelines for conducting and reporting qualitative EE research developed during a 10-hour (1½-day) workshop sponsored by the North American Commission on Environmental Education Research during the 1997 annual meeting of the North American Association for Environmental Education. They represent the efforts of 30 EE practitioners, researchers, and scholars, and are offered in the spirit of academic and intellectual collaboration and cooperation. It is hoped that, by sharing the efforts of the workshop participants with the much broader EE research community unable to participate in the workshop, opportunities for comment and, if necessary, revision of the guidelines will be maximized.

1350-4622/00/010009-18 © 2000 Taylor & Francis Ltd

Introduction

Education is a research-based discipline (Crowl, 1996, p. 3). Presumably, this contention includes environmental education (EE). Unfortunately, many perceive EE either as not being research based, whether in subject matter or in theory and practice, or as being based on inadequate research. Perhaps this is because, as O'Hearn (1982) once suggested, EE is notoriously difficult to evaluate because humans, the ultimate focus of EE research, do not often, if ever, lend themselves to rigid experimental control. Perhaps it is because of far more serious reasons, including 'a distinct shortage of well controlled studies reporting classroom methodologies for environmental education [and] studies which are reported are lacking in methodological rigor and their validity is questionable' (Lewis, 1981/82, p. 15); 'lack of thoroughness and rigor' (Stewart, 1982, p. 42); mitigation of the utility of findings 'by problems of experimental design and data analysis' (Leeming et al., 1993, p. 20); and/or because EE research is 'frequently found to be based on unsatisfactory samples, idiosyncratic in character and lacking in reliability and validity' (Naish, 1993, p. 64).

Education research is commonly conceptualized as including two methods of inquiry (Krathwohl, 1993; Ary et al., 1996; Bordens & Abbott, 1996; Fraenkel & Wallen, 1996; Gall et al., 1996): qualitative, which is a 'generic term for a variety of approaches of educational research and evaluation variously labeled as ethnography, naturalistic inquiry, case studies, fieldwork, field studies, and participant observation' (Ary et al., 1996, p. 475), and quantitative, which is 'grounded in the assumption that features of the social environment constitute an independent reality and are relatively constant across time and settings' (Gall et al., 1996, p. 28). 'Qualitative inquiry is multimethod in its focus, involving an interpretive, naturalistic approach to its subject matter. [R]esearchers study things in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them' (Denzin & Lincoln, 1994, p. 2), while quantitative inquiry is multimethod in its focus, involving a causal-comparative approach to its subject matter. Researchers employ 'operational definitions to generate numerical data to answer predetermined hypotheses or questions' (Ary et al., 1996, p. 573). The terms qualitative research and postpositivist research are 'virtually synonymous', as are the terms quantitative research and positivist research (Gall et al., 1996, p. 28). Reports of EE research most commonly describe efforts using quantitative methods of inquiry (Marcinkowski, 1993, p. 56); however, reports of efforts using qualitative methods, while uncommon, are not completely absent.
This may be because, unlike quantitative research reports, the reporting of qualitative research findings/results is difficult, since there is no 'well-codified, generally accepted protocol available as to how the methodology and findings' should be reported (Knafl & Howard, 1984, p. 17). Qualitative research should, nevertheless, 'be evaluated on the same overall basis as other research, that is, according to whether it makes a substantive contribution to empirical knowledge and/or advances theory' (Ambert et al., 1995, p. 883). A liability of EE research is the absence of guidelines, generally agreed upon both by those conducting the research and reporting the findings/results and by those reviewing and evaluating reports of research for publication in the literature of the field, for what will be considered acceptable research, regardless of method of inquiry. The purpose of this report is to introduce a set of potential guidelines [1] for conducting and reporting qualitative EE research that were developed by 30 attendees of a 10-hour (1½-day) workshop on EE research sponsored by the North American Commission on Environmental Education Research (NACEER) during the 1997 annual conference of the North American Association for Environmental Education (NAAEE). It is my hope that, by sharing the results of the workshop with the broader EE research community, opportunities will be maximized for feedback from individuals who were not able to participate in the workshop but who nevertheless have much to contribute to the quest to identify guidelines for conducting and reporting qualitative EE research.


Background

At the 1990 annual conference of the NAAEE, a symposium organized by the NACEER titled 'Contesting Paradigms in Environmental Education Research' was offered (Simmons et al., 1990, p. 79). The outcome of the symposium was documented in the publication Alternative Paradigms in Environmental Education Research (Mrazek, 1993). As a result of the widely acknowledged contribution of both the symposium and the monograph to the EE field, some members of the NACEER felt the need to extend the visibility of research offerings at the NAAEE annual conference. There was some question about what the NACEER could offer that would be valued by conference attendees. A possible answer was identified during several informal conversations at the 22nd annual conference of the NAAEE in 1993, and again during formal meetings of the NACEER at the 24th annual meeting of the NAAEE in 1995. The request was for workshops on how to conduct EE research. This led Rick Mrazek, then the Canadian Co-Chair of the NACEER, and me to propose a pre-conference workshop titled 'EE Research: Guidelines for Excellence' for the 26th annual conference of the NAAEE in 1997. The description of the workshop was:

EE research has been criticized for lack of rigor and educational merit of reported findings or results. This workshop will introduce research methods and designs and provide guidelines intended to improve the status of EE research and reporting of findings.

Our intention was to address Volk's (1990) question 'Why aren't we teaching … the basics of research?' by providing workshop participants with the guidelines for conducting educational research found in an assortment of appropriate textbooks used at various post-secondary institutions, as well as the guidelines for editorial review provided by the predominant academic publications dedicated to EE.

Additional motivation for the workshop was to provide training that might address the history of sustained criticism of the quality of EE research being conducted and reported (see, for example, Lewis, 1981/82; Stewart, 1982; Iozzi, 1989; Marcinkowski, 1988/89; Gigliotti, 1990; Leeming et al., 1993; Naish, 1993), as well as criticisms of EE as a field, which may arguably be related to the quality of the research that is presumably the 'base' for the field (see, for example, Adler, 1993a,b; Kwong, 1995). An electronic literature search of the ERIC, Periodicals Contents Index, PsycINFO, Reader's Guide Abstracts (Wilson), and Social Science Abstracts (Wilson) databases was executed using keywords such as 'guidelines', 'check list', 'evaluation', and 'educational research'. Additionally, a search of the University of Illinois library system, the third largest academic library in the world with regard to total holdings, was conducted for textbooks on educational research. Appropriate titles were reviewed for the presence of specific guidelines for conducting and reporting education research. The resources that formed the basis of the workshop were (see Appendix):

Textbooks

FRAENKEL, J.R. & WALLEN, N.E. (1996) How to Design and Evaluate Research in Education, 3rd edn (New York, McGraw-Hill).


GALL, M.D., BORG, W.R. & GALL, J.P. (1996) Educational Research: an introduction, 6th edn (White Plains, NY, Longman).

Journal Articles

FARQUHAR, W.W. & KRUMBOLTZ, J.D. (1959) A check list for evaluating experimental research in psychology and education, Journal of Educational Research, 52(9), pp. 353-354.

Journal Editors' Guidelines

Environmental Education Research.

Journal of Environmental Education.

International Research in Geographical and Environmental Education.

The workshop began with a presentation (using an overhead transparency) of several definitions of research found in contemporary educational research textbooks, offered as an advance organizer (Fig. 1). Next, an overhead addressing the question 'Why conduct research?' was offered, again with several answers found in educational research textbooks (Fig. 2). The third introductory aspect of the workshop was an overhead transparency of a table comparing and contrasting qualitative and quantitative methods of inquiry (Fig. 3), modeled after one offered by Marcinkowski (1993, p. 43).

'The formal, systematic application of scholarship, disciplined inquiry, and most often the scientific method to the study of problems.' (Fraenkel & Wallen)

'The principal method for acquiring knowledge and uncovering the causes for behavior.' (Bordens & Abbott)

'Systematic, controlled, empirical and critical investigation.' (Kerlinger)

'The application of the scientific approach to the study of a problem.' (Ary, Jacobs & Razavieh)

FIG. 1. Definitions of research found in many contemporary textbooks.

Why conduct research?
- To explore: to become familiar with phenomena; to gain new insights; to formulate a more specific research problem or research hypothesis.
- To describe: to portray accurately the incidence, distribution, and characteristics of a group or situation.
- To explain and/or predict: to investigate relationships between variables.
- To control: to test hypotheses of causal relationships between variables.

FIG. 2. Possible answers to the question: Why conduct research?

Qualitative: Presents findings in narrative form.
Quantitative: Presents findings in numerical form.

Qualitative: Derived from the social sciences.
Quantitative: Derived from the natural and physical sciences.

Qualitative: Assumes that multiple realities may be constructed through social processes.
Quantitative: Assumes that there are social facts with a single objective reality apart from individuals' beliefs.

Qualitative: Truth consists of a complex of value-laden observations and interpretations.
Quantitative: Truth consists of observable and verifiable (or objective) facts, and not of internal conditions.

Qualitative: Seeks to establish understanding of social phenomena from participants' perspectives.
Quantitative: Seeks to establish patterns of relationships between, and causes of, social phenomena.

Qualitative: Flexible procedures allow the questions and design to emerge or develop during the study.
Quantitative: Inflexible procedures are determined prior to beginning the study and are adhered to strictly.

Qualitative: Ethnographic, historical.
Quantitative: Survey, correlational, quasi-experimental, experimental.

Qualitative: Researcher becomes immersed in the setting and tends to rely on disciplined subjectivity in order to cultivate understanding during, and as part of, data collection.
Quantitative: Researcher remains detached from the setting to avoid bias and tends to rely upon instruments as an intermediary device for data collection.

Qualitative: Validity and reliability of the data are related to the data sources, i.e. the respondents.
Quantitative: Validity and reliability of the data are related to the measurement device(s).

Qualitative: Usually employs some form of content analysis.
Quantitative: Usually employs parametric and nonparametric statistics.

Qualitative: Generalizations are usually limited to the specific settings and conditions of the study.
Quantitative: Efforts are often made to generalize beyond the specific settings of the study.

FIG. 3. Comparison of qualitative and quantitative methods of inquiry.

Soon after the workshop began and the initial presentation was completed, one participant suggested that offering guidelines for EE research was premature, even irresponsible, in light of the fact that the field has not satisfactorily answered the question 'What is research [vis-à-vis EE research]?' Additionally, some workshop participants seemed more concerned with arguing the superiority of one method of inquiry (more accurately, the rejection of one method of inquiry) over the other. They seemed, in a sense, more interested in arguing that the way EE research is conducted is based on a flawed mythology, much as the gorilla in Quinn's Ishmael (1992) argued that human development is based on a flawed mythology. This position, I believed, was not going to advance the field. It is simply not an 'either/or' condition. Instead of dwelling on the differences in the methods of inquiry, I believed it would be far more productive to explore how the complementary methods of inquiry can be employed collaboratively and cooperatively to explore more completely the issues facing EE, as well as society's challenge to develop in an ecologically sustainable manner.

The debate escalated in tension to the point where another participant stood and demanded that the workshop be offered as it was advertised, and for which he had paid an extra fee, and that anyone not similarly interested should not interfere with others attempting to get that for which they had paid and registered. Due to an error in the advertisement of the time and dates for the workshop, a total of 52 individuals had registered for the pre-conference workshop thinking it was only 4 hours rather than 10 hours. This error meant that several of the participants were able to attend only one-half of the workshop. While some of the participants, therefore, knew that they would not obtain the information they sought, most still agreed with him, but also felt that exploring the creation of unique guidelines for EE research would be productive. It was agreed that, after a break for lunch, the remaining group would reconvene and attempt to articulate the guidelines. It was decided that identifying guidelines for conducting and reporting research based on qualitative methods of inquiry would be attempted first.

The actual technique used to identify the guidelines may best be described as an informal hybridization of the Q-sort and Delphi techniques. That is, participants sorted a set of guidelines provided to them into ranked categories to express their opinions (the Q-sort technique), as well as providing group-judgement answers to the question of what the guidelines should be (the Delphi technique) as part of the workshop exercises (cf. Krathwohl, 1993, p. 555). After an exhausting afternoon and following morning, the following guidelines were crafted. To the credit of the workshop participants, the initial difficulties were overcome, and virtually all participants indicated they had taken part in a very rewarding exercise. In fact, I was approached by many of the participants at various times during the remainder of the conference and told just how productive they felt the workshop had been. Many indicated that, had time permitted, they would have been willing, after a good night's sleep, to return and address identifying guidelines for conducting and reporting EE research using quantitative methods of inquiry. Such a workshop, albeit with different participants, was conducted during the 27th annual conference of the NAAEE in 1998, again sponsored by the NACEER.
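For readers unfamiliar with the Q-sort/Delphi hybrid described above, the ranking-and-aggregation half of such an exercise can be sketched in code. The sketch below is purely illustrative: the participant labels, guideline names, and rank values are hypothetical (none appear in the workshop report), and a formal Q-sort additionally imposes a forced quasi-normal distribution of ranks rather than the free ranking shown here.

```python
from statistics import mean, pstdev

def aggregate_round(rankings):
    """Aggregate one Delphi round of Q-sort rankings.

    rankings: dict mapping participant -> dict of
        candidate guideline -> rank (1 = most important).
    Returns a list of (guideline, mean_rank, spread) tuples
    sorted by mean rank; a small spread suggests emerging
    consensus on that guideline's importance.
    """
    guidelines = {g for ranks in rankings.values() for g in ranks}
    summary = []
    for g in sorted(guidelines):
        ranks = [r[g] for r in rankings.values() if g in r]
        summary.append((g, mean(ranks), pstdev(ranks)))
    summary.sort(key=lambda item: item[1])  # most important first
    return summary

# Hypothetical data: three participants rank three candidate guidelines.
round_one = {
    "P1": {"context": 1, "ethics": 2, "literature": 3},
    "P2": {"context": 2, "ethics": 1, "literature": 3},
    "P3": {"context": 1, "ethics": 3, "literature": 2},
}

for guideline, avg, spread in aggregate_round(round_one):
    print(f"{guideline}: mean rank {avg:.2f}, spread {spread:.2f}")
```

In a multi-round Delphi process, the round summary would be fed back to participants, who re-sort the items; shrinking spread values across rounds signal convergence toward the kind of group judgement the workshop sought.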
The participants in the workshop believed the identification of these guidelines was particularly important because efforts to provide comparable guidelines in the most popular educational research textbooks represent little more than attempts to modify guidelines for quantitative research so that they can inform qualitative research efforts, without any real effort to recognize the subtle and not-so-subtle differences between the two paradigms. Additionally, the participants felt that EE research is unique for a variety of reasons and, therefore, should have its own research guidelines, which would likely differ from generic educational research guidelines.

Guidelines

The guidelines for conducting and reporting qualitative EE research are divided into five sections (see Table 1). Each section includes questions intended to guide a researcher in designing a study and composing the report of its findings. The guidelines may also prove valuable for members of the editorial review boards of the various journals devoted to reporting EE research, and for students just beginning their careers as researchers. In order to avoid the impression of judgement, the guidelines are presented in the second person.

TABLE 1. Potential guidelines for conducting and reporting EE research using qualitative methods of inquiry

Introduction
1. Have you adequately described how the research problems, procedures, or outcomes were shaped by your institutional affiliations, beliefs, values, or theoretical orientation?
2. Is the literature cited relevant and sufficiently representative?

Research procedures
3. Was the choice of cases or research participants appropriate for the intended research?
4. Was there sufficient richness of data collection?
5. Have you adequately ensured credibility and trustworthiness of the data?
6. Is the context of the research (e.g. historical, cultural, etc.) adequately described?
7. Have you adequately described the rationale for the selection of the specific research procedure(s)?
8. Have you adequately addressed the ethical considerations related to the research purpose(s)?

Research outcomes
9. Did you provide sufficiently detailed information about the data to permit interpretation?
10. Have you adequately described the relationship between the data and the outcome(s)?
11. Have you adequately described the evolving relationship between the emerging research question(s) and the data?
12. Were the appropriate data analysis techniques used, and were they described adequately?

Discussion
13. Have you offered a reasonable interpretation of the outcome(s)?
14. Did you draw reasonable implications of the outcome(s) for the research participants or other audiences?

General
15. Did you demonstrate consistency between Introductory Section 1 and the research design, data collection, and interpretation?

Introduction

1. Have you adequately described how the research problems, procedures, or outcomes were shaped by your institutional affiliations, beliefs, values, or theoretical orientation?

The thinking of the participants regarding this question was that every researcher, regardless of intention, is influenced by the particular circumstances of her/his employment, beliefs, values, and/or theoretical orientation. To presume that any researcher is completely devoid of emotion and blindly objective was considered unrealistic by the participants. Rather than be deluded by such unrealistic expectations, the participants agreed that a researcher should acknowledge her/his personal characteristics and divulge them openly and honestly. In this manner, a consumer of the research can understand, in a more scholarly fashion, the agenda or worldview that may have guided the conceptualization, execution, and reporting of the research and can thereby be better positioned to judge the extent to which those characteristics or circumstances may have influenced the research protocol and/or the reporting of the research findings.

2. Is the literature cited relevant and sufficiently representative?

Regardless of the research paradigm employed or the focus of the research, participants agreed that a researcher should demonstrate considerable familiarity with the literature relevant to the focus of the investigation. Participants also agreed that the literature review should be free of subjective bias and truly representative of the breadth of scholarly reporting. Specifically, they felt that representation of the literature describing both qualitative and quantitative research findings/results, to the extent that they exist for the specific research focus, is important, especially since, they felt, reports of findings from qualitative research efforts are often overlooked or, worse, ignored.

Research Procedures

1. Was the choice of cases or research participants appropriate for the intended research?

As with quantitative EE research efforts, where considerations of the appropriateness of either a sample or a population are important, if not essential, participants felt that the selection of cases or research participants for qualitative studies is similarly important. Workshop participants felt that a researcher must be able to present a rationale for the selection of cases or research participants regarding their potential to provide data that are relevant to the research question(s).

2. Was there sufficient richness of data collection?

Workshop participants felt that a researcher must demonstrate that the data collection techniques maximized the opportunities for comprehensive exploration of the research questions. In-depth and follow-up questions designed to delve into the underlying and sometimes obscured origins of perceptions and positions were suggested as mechanisms for increasing the depth of data collection. It has often been suggested that qualitative research designs result in a 'richness' of data not often possible with quantitative designs. Workshop participants intended this guideline to serve as a check to ensure that qualitative EE research efforts produce such 'rich' data.

3. Have you adequately ensured credibility and trustworthiness of the data?

This is not an original suggestion, as most published reports dealing with qualitative research recommend techniques such as triangulation for this purpose.

4. Is the context of the research (e.g. historical, cultural, etc.) adequately described?

Workshop participants felt that, as with guideline 1, a researcher must expend real effort to consider, and subsequently report, the precise nature of the context of the research. As with any research effort, this description must be of sufficient detail to permit a consumer to appreciate the precise nature and circumstances surrounding the collection of the data.

5. Have you adequately described the rationale for the selection of the specific research procedure(s)?

In quantitative research designs, it is expected that a researcher will describe both the rationale for and the sequence of the specific research procedures, to ensure that subsequent investigators may understand how and why particular approaches were used. Workshop participants felt that, while the potential for replication is often considered the sine qua non of quantitative research efforts, it does not have the same importance in qualitative research efforts, given the complexity and inconsistency of the human character. Defence of the research procedures was, nonetheless, deemed an essential aspect of conducting and reporting qualitative EE research if that research is to provide a positive contribution to the theory and practice of EE. The importance, workshop participants felt, was again connected to facilitating the interpretation of the data. Participants felt that only by having a precise understanding of the rationale for the specific research procedures could a consumer of the research findings assess their merits and demerits.

6. Have you adequately addressed the ethical considerations related to the research purpose(s)?

When people are asked to reveal how they feel about certain issues, there is often an element of privacy that is exposed. Similarly, some questions are difficult to ask because there is often no way for a research participant to respond to them without placing her/himself in a compromising position. This is especially true when a researcher is directly involved with the data collection and when the focus of the research is human behavior, viz. the modification of it. It is for precisely this reason that many institutions where research is conducted have governing panels charged with evaluating the ethical considerations of a research proposal before permission to conduct the research is granted. Workshop participants felt that a researcher should make a genuine effort to address the ethical considerations related to the research process and purpose.

Research Outcomes

1. Did you provide sufficiently detailed information about the data to permit interpretation?

Workshop participants agreed that it would be counterproductive to describe the research procedures in exacting detail if the researcher then provided only a superficial presentation of the data. Unlike quantitative inquiries, with their dependence on the presentation of numerical data, which is often sufficient to interpret the findings of the study, qualitative inquiries depend on a narrative description of the data. In fact, the data are almost always in narrative form. As a result, qualitative research does not lend itself to the brevity of data presentation often found in quantitative research. Workshop participants felt that researchers must understand this characteristic of qualitative research and should be expected to present the data accordingly.

2. Have you adequately described the relationship between the data and the outcome(s)?

There is a temptation by some to mistake the flexibility inherent in qualitative investigations for a lack of connection between the data and the outcomes. This is a misinterpretation. In order to reduce the possibility of such a misinterpretation, a researcher must be certain to establish clearly the link between the data and the outcomes. While qualitative inquiries include a dimension of flexibility regarding data collection procedures, the ultimate aim of understanding a phenomenon cannot be compromised by expanding the domain of appropriate questioning beyond that which is clearly related to the outcomes.


3. Have you adequately described the evolving relationship between the emerging research question(s) and the data?

Unlike quantitative methods of inquiry, which are guided by a paradigm that often assumes stasis regarding the research focus, qualitative methods of inquiry are guided by a paradigm that often rejects the notion of stasis. It recognizes, that is, that humans are ever changing and that, as a result, a research effort must be sufficiently flexible to adapt to the data as they are accumulated and analyzed. Unlike quantitative inquiries, in which a prescribed research procedure, identified from or based on pre-ordained research questions or null hypotheses, is not to be deviated from regardless of the effect on the data collection, qualitative research efforts depend on the flexibility to modify and deviate from an original procedure, especially if such modifications will support an increase in the richness of the data or the identification of subsequent research questions. Workshop participants felt that a researcher should describe in detail any evolution in the relationship between the emerging research questions and the data.

4. Were the appropriate data analysis techniques used, and were they described adequately?

A criticism the group had regarding data analysis techniques centered on their perception that many researchers try to force data obtained via qualitative methods of inquiry into a method of analysis designed for data obtained via quantitative methods of inquiry, that is, statistical analysis of numerical data using inferential statistics. Workshop participants felt no attempt should be made to force data obtained via qualitative methods of inquiry to fit within the parameters necessary for most inferential statistics procedures. They agreed that the use of verbal data should be as credible as the use of numerical data.

Discussion

1. Have you offered a reasonable interpretation of the outcome(s)?

Regardless of the research paradigm, investigators must not interpret the outcomes of their research beyond that which the data will support. In quantitative investigations, such interpretation is generally supported by objective statistical techniques that frequently leave less margin for subjective interpretation of numerical data. This statement is not meant to imply that numerical data cannot lead to unreasonable statistical interpretation of outcomes; to be sure, they can and do. It is, however, meant to suggest that it requires much more skill to interpret outcome data that are largely narrative in form than it does to interpret numerical data. In qualitative inquiries, the potential for subjectivity in interpreting outcomes, some may argue, increases. It is important, therefore, for a researcher to exercise caution when interpreting the outcome data.

2. Did you draw reasonable implications of the outcome(s) for the research participants or other audiences?

As with any research, regardless of paradigm, the findings are usually intended to inform practice, provide theoretical foundations for phenomena related to the field, generate hypotheses about how to expand the knowledge base in the field, etc. If the implications of the outcomes are presented in a fashion that does not


accomplish these types of goals, the research will likely not make a substantive contribution to the ® eld. Researchers must be certain to interpret outcomes in a manner which will advance the ® eld. A researcher must guard against allowing too much of her/himself to be expressed in the implications of the outcomes. General 1. Did you demonstrate consistency between Introductory Section 1 and the research design, data collection and interpretation? With this guideline, the workshop participants were interested in assuring that a researcher not only `talked the talk’ but `walked the walk’. That is, they were interested in assuring that a researcher demonstrated a degree of objectivism regarding how her/his institutional af® liations, beliefs, values, or theoretical orientation not only affected the study protocol, but also data collection and interpretation. It was their perception that in some published studies, a researcher would suggest adherence to the guidelines in Introductory Section 1, but fail to demonstrate such adherence when analyzing and interpreting the data. Failure to comply with this guideline, they felt, would reveal true biases that might well be masked by suggesting adherence with the guidelines in Introductory Section 1. Conclusion That the guidelines introduced herein may be imperfect is not surprising only for educational research but also the human species. People do the best with what they have. Societies have gone to war and killed each other because of differences of opinion concerning guidelines for life. EE researchers should not resort to such animosity, antagonism, or hegemony regarding methods of inquiry. This notion reminds me of something Harold Hungerford (1996) once said regarding the debate about the inferiority/superiority of one method of inquiry over another: `arguments about the ª goodnessº of alternative paradigms simply must stop. 
The key to using any method of inquiry rests with the validity of the research itself and not which paradigm is used'. That the previously published guidelines for conducting and reporting quantitative research are not entirely consistent with the needs of EE researchers is probably a defensible position. That the published guidelines for conducting and reporting qualitative research are also not entirely consistent with the needs of EE researchers is probably defensible as well, as is the notion that some guidelines for qualitative research often represent little more than attempts to shape qualitative research in the mold of quantitative research. The workshop participants felt, nevertheless, that engaging in a dialogue would be the best way to facilitate the development of a set of guidelines that would advance the efforts of EE researchers. I doubt any EE researcher or group of researchers will ever design a perfect study or produce a perfect report of findings. Such researchers must attempt to do the best they can with what they have. An agreed set of guidelines for the conduct and reporting of research, regardless of the method of inquiry, that represents the best guidance currently available and, this point is vital, that is subject to regular review and revision should, however, be valuable to the


field. What is needed now is feedback from the EE research community at large regarding the validity of the proposed guidelines.

This process illustrates both the pros and cons of EE research. On the pro side is the fact that so many participants, often from diametrically opposed points of view, academic preparation, professions, etc., took part in the workshop, attempted to put their philosophical differences aside, and were involved in the crafting of the guidelines. The con is that many involved in EE research were not participants in the workshop. Another pro is that the guidelines are being made public in this document for feedback from others. The con will be if others do not provide said feedback. Perhaps the most important pro is the evidence that the number of people interested in EE research, including in identifying guidelines to improve its quality, is increasing.

Notes on Contributor

N. J. SMITH-SEBASTO is an Assistant Professor and Extension Specialist for Environmental Education at the University of Illinois at Urbana-Champaign. He is on the Board of Directors of the North American Association for Environmental Education (NAAEE). He is also the Chair of the Research Commission of the NAAEE. In addition to serving on the International Editorial Board for this journal, he is a Consulting Editor for The Journal of Environmental Education. He is the 1999 recipient of the NAAEE Award for Outstanding Contributions to Research in Environmental Education.

Notes

[1] I wish to acknowledge that the guidelines introduced herein were developed in a workshop on EE research held during the 1997 Annual Conference of the North American Association for Environmental Education, which I co-organized and which was sponsored by the North American Commission for Environmental Education Research. While I composed this article, my role in the workshop was largely that of a facilitator.
The participants in the workshop were: Dr Rob Bixler, Ms Connie Botma, Dr Marceline Collins-Figueroa, Ms Carmen deChaeir, Ms Kristina Dupps, Dr Geoff Fagan, Dr Wei-Ta Fang, Dr Edgar Gonzalez-Guadiano, Ms Lara Halenda, Ms Roxine Hameister, Dr John Huckle, Dr Pat Irwin, Dr Regula Kyburz-Graber, Dr Bill McBeth, Ms Christina McDonald, Ms Jane McRae, Ms June McSwain, Dr Rick Mrazek, Ms Martha Nghidengua, Dr Kate Ottnad, Mr Randy Robinson, Dr Ian Robottom, Ms Connie Ruth, Dr Carol Saunders, Dr Danie Schrelider, Dr Bonnie Shapiro, Ms Teresa Squazzin, Mr John Thompson, Dr Eureta Janse van Rensburg.

REFERENCES

ADLER, J.H. (1993a) A child's garden of misinformation, Consumer's Research, 76(September), pp. 11-16.
ADLER, J.H. (1993b) The greening of America's youth, The Wall Street Journal, 14 April, p. A14.
AMBERT, A.-M., ADLER, P.A., ADLER, P. & DETZNER, D.F. (1995) Understanding and evaluating qualitative research, The Journal of Marriage and the Family, 57, pp. 879-893.
ARY, D., JACOBS, L.C. & RAZAVIEH, A. (1996) Introduction to Research in Education, 5th edn (Orlando, FL, Harcourt Brace College).
BORDENS, K.S. & ABBOTT, B.B. (1996) Research Design and Methods: a process approach, 3rd edn (Mountain View, CA, Mayfield).
CROWL, T.K. (1996) Fundamentals of Educational Research, 2nd edn (Madison, WI, Brown & Benchmark).


DENZIN, N.K. & LINCOLN, Y.S. (1994) Introduction: entering the field of qualitative research, in: N.K. DENZIN & Y.S. LINCOLN (Eds) Handbook of Qualitative Research (Thousand Oaks, CA, Sage).
FRAENKEL, J.R. & WALLEN, N.E. (1996) How to Design and Evaluate Research in Education, 3rd edn (New York, McGraw-Hill).
GALL, M.D., BORG, W.R. & GALL, J.P. (1996) Educational Research: an introduction, 6th edn (White Plains, NY, Longman).
GIGLIOTTI, L.M. (1990) Environmental education: what went wrong? What can be done?, The Journal of Environmental Education, 22(1), pp. 9-12.
HUNGERFORD, H.R. (1996) Comments made during 'EE: a panel discussion of its past, its present, and its future', 25th Annual Conference of the North American Association for Environmental Education, San Francisco, CA, 1-5 November.
IOZZI, L.A. (1989) What research says to the educator. Part 1: environmental education and the affective domain, The Journal of Environmental Education, 20(3), pp. 3-9.
KNAFL, K.A. & HOWARD, M.J. (1984) Interpreting and reporting qualitative research, Research in Nursing and Health, 7, pp. 17-24.
KRATHWOHL, D.R. (1993) Methods of Educational and Social Science Research: an integrated approach (White Plains, NY, Longman).
KWONG, J. (1995) On coercive environmental education, Religion & Liberty, 5(2), pp. 7-10.
LEEMING, F.C., DWYER, W.O., PORTER, B.E. & COBERN, M.K. (1993) Outcome research in environmental education: a critical review, The Journal of Environmental Education, 24(4), pp. 8-21.
LEWIS, G.E. (1981/82) A review of classroom methodologies for environmental education, The Journal of Environmental Education, 13(2), pp. 12-15.
MARCINKOWSKI, T. (1988/89) Commentary on 'when words speak louder than actions', The Journal of Environmental Education, 20(2), pp. 3-5.
MARCINKOWSKI, T. (1993) A contextual review of the 'quantitative paradigm' in EE research, in: R. MRAZEK (Ed.) Alternative Paradigms in Environmental Education Research, pp. 29-79 (Troy, OH, North American Association for Environmental Education).
MRAZEK, R. (Ed.) (1993) Alternative Paradigms in Environmental Education Research, Monographs in Environmental Education and Environmental Studies, Vol. VIII (Troy, OH, North American Association for Environmental Education).
NAISH, M. (1993) Forum introduction: 'Never mind the quality, feel the width': how shall we judge the quality of research in geographical and environmental education?, International Research in Geographical and Environmental Education, 2(1), pp. 64-65.
O'HEARN, G.T. (1982) What is the purpose of evaluation?, The Journal of Environmental Education, 13(4), pp. 1-3.
QUINN, D. (1992) Ishmael: a novel (New York, Bantam/Turner Books).
SIMMONS, D.A., KNAPP, C. & YOUNG, C. (Eds) (1990) Setting the EE Agenda for the '90s, selected papers from the 19th Annual Conference of the North American Association for Environmental Education (Troy, OH, North American Association for Environmental Education).
STEWART, J. (1982) Empirical research reported in The Journal of Environmental Education: a critique, The Journal of Environmental Education, 14(1), pp. 42-44.
VOLK, T.L. (1990) The importance of learners doing research, Environmental Communicator, September/October, p. 7.


Appendix

FRAENKEL, J.R. & WALLEN, N.E. (1996) How to Design and Evaluate Research in Education, 3rd edn (New York, McGraw-Hill).

1. Type of research
   A. Experimental: (1) Pre (2) True (3) Quasi
   B. Correlational
   C. Survey
   D. Interview
   E. Causal-comparative
   F. Ethnographic
2. Justification
   A. No mention of justification
   B. Explicit argument made with respect to worth of study
   C. Worth of study is implied
   D. Any ethical considerations overlooked?
3. Clarity
   A. Focus clear? (Y or N)
   B. Variables clear? (1) Initially (2) Eventually (3) Never
   C. Is treatment in intervention studies made explicit? (no, yes, or n.a.)
   D. Is there a hypothesis? (1) No (2) Yes: explicitly stated (3) Yes: clearly implied
4. Are key terms defined?
   A. No
   B. Operationally
   C. Constitutively
   D. Clear in context of study
5. Sample
   A. Type: (1) Random selection (2) Representative based on argument (3) Convenience (4) Volunteer (5) Can't tell
   B. Was sample adequately described? (1 = high; 5 = low)
   C. Size of sample (n)
6. Internal validity
   A. Possible alternative explanations for outcomes obtained:
      (1) History
      (2) Maturation
      (3) Mortality
      (4) Selection bias/subject characteristics
      (5) Pretest effect
      (6) Regression effect
      (7) Instrumentation
      (8) Attitude of subjects
   B. Threats discussed and clarified? (Y or N)
   C. Was it clear that the treatment received an adequate trial? (in intervention studies) (Y or N)
   D. Was length of time of treatment sufficient? (Y or N)
7. Instrumentation
   A. Reliability: (1) Empirical check made? (Y or N) If yes, was reliability adequate for study?
   B. Validity: (1) Empirical check made? (Y or N) (2) If yes, type: (a) content (b) concurrent (c) construct
8. External validity
   A. Discussion of population generalizability
      (1) Appropriate: (a) Explicit reference to defensible target population (b) Appropriate caution expressed
      (2) Inappropriate: (a) No mention of generalizability (b) Explicit reference to indefensible target population
   B. Discussion of ecological generalizability
      (1) Appropriate: (a) Explicit reference to defensible settings (subject matter, materials, physical conditions, personnel, etc.) (b) Appropriate caution expressed
      (2) Inappropriate: (a) No mention of generalizability (b) Explicit reference to indefensible settings
9. Were results and interpretations kept distinct? (Y or N)
10. Data analysis
    A. Descriptive statistics? (Y or N) (1) Correct technique? (Y or N) (2) Correct interpretation? (Y or N)
    B. Inferential statistics? (Y or N) (1) Correct technique? (Y or N) (2) Correct interpretation? (Y or N)
11. Do data justify conclusions? (Y or N)
12. Were outcomes of study educationally significant? (Y or N)
13. Relevance of citations


GALL, M.D., BORG, W.R. & GALL, J.P. (1996) Educational Research: an introduction, 6th edn (White Plains, NY, Longman).

1. Introductory Section
   1. Are the research problems or findings unduly influenced by the researchers' institutional affiliations, beliefs, values, or theoretical orientation?
   2. Do the researchers demonstrate undue positive or negative bias in describing the subject of the study (an instructional method, program, curriculum, etc.)?
   3. Is the literature review section of the report sufficiently comprehensive? Does it include studies known to be relevant to the problem?
   4. Is each variable in the study clearly defined?
   5. Is the measure of each variable consistent with how the variable was defined?
   6. Are hypotheses, questions, or objectives explicitly stated, and if so, are they clear?
   7. Do the researchers make a convincing case that a research hypothesis, question, or objective was important to study?
2. Research Procedures
   1. Did the sampling procedures produce a sample that is representative of an identifiable population or of a local population?
   2. Did the researchers form subgroups to increase understanding of the phenomena being studied?
   3. Is each measure in the study sufficiently valid for its intended purpose?
   4. Is each measure in the study sufficiently reliable for its intended purpose?
   5. Is each measure appropriate for the sample?
   6. Were the research procedures appropriate and clearly stated so that others could replicate them if they wished?
3. Data Analysis
   1. Were appropriate statistical techniques used, and were they used correctly?
4. Discussion of Findings/Results
   1. Do the findings or results of the data analyses support what the researchers conclude are the findings of the study?
   2. Did the researchers provide reasonable explanations of the findings/results?
   3. Did the researchers draw reasonable implications for practice from their findings/results?

FARQUHAR, W.W. & KRUMBOLTZ, J.D. (1959) A check list for evaluating experimental research in psychology and education, Journal of Educational Research, 52(9), pp. 353-354.

1. The Problem
   1. Was the problem clearly defined?
   2. Was a verifiable hypothesis formulated?
   3. Was the hypothesis one which was logically deduced from some theory (or problem)?


2. The Design
   1. Was the statistical design employed in the investigation appropriate to the particular experimental methods, conditions, subjects, and hypotheses under test?
   2. Was the population studied clearly specified?
   3. Was the method, or methods, of drawing a sample from the population clearly specified?
   4. Was a control group(s) chosen in the same manner and from the same population as the experimental group(s)?
   5. Were the various treatments (including control) assigned at random to the group(s)?
   6. Did the experiment include a replication?
   7. Was the level of significance necessary for rejection of the null hypothesis specified before the data were collected and analyzed?
3. The Procedure
   1. Were the treatments and methods of collecting data described so that an independent investigator could replicate the experiment?
   2. Were the size and characteristics of the sample adequately described?
   3. Were the treatments administered so that extraneous sources of error were either held constant for all treatment and control groups or randomized among subjects within all groups?
4. The Analysis
   1. Was the criterion of evaluation appropriate to the objectives of the study?
   2. Was any evidence of the reliability of the criterion measure given for the experimental sample?
   3. Were the statistical assumptions which are necessary for a valid test of the hypotheses satisfied?
5. The Interpretation
   1. Were the conclusions consistent with the obtained results?
   2. Were generalizations confined to the population from which the sample was drawn?

Environmental Education Research Review Questions

Suitability for EER
1. Are [the article's] focus and contents appropriate?
2. Are the research methods (where appropriate) coherent?
3. Is [the article] analytical and critical?
4. Are the conclusions coherent?
5. Are the ideas transferable to other educational systems and cultures?
6. Will [the article] be understood by an international audience?

Readiness for Publication
7. Can [the article] be published now with no amendments?
8. May it be published following minor editorial attention?
9. Is it only suitable for publication following revision by the author?
10. Is a major re-working of the text needed?


Contribution to the Literature
11. Important; publish as a priority.
12. Informative; publish in the normal schedule.
13. Worthy; but no particular publishing imperative.
14. Insignificant; should not be published.

International Research in Geographical and Environmental Education Reviewer Questions

1. Are the topic and its treatment consistent with the policy of the journal?
2. Does the content of the paper contribute in an interesting way to the development of research in the domain treated?
3. Is the nature of the problem treated described clearly?
4. Is the theoretical presentation relevant to the problem treated?
5. Are the methods and statistical analysis appropriate?
6. Are the results described in a clear and orderly manner?
7. Are the interpretation and discussion of results well founded?
8. Did you experience any difficulties in comprehension because of:
   (a) insufficient information?
   (b) superfluous information?
   (c) poor organization of the text?
   (d) a lack of clarity in the tables and figures?
   (e) the author's style?
9. Are the following acceptable:
   (a) the length of the paper?
   (b) the references?
   (c) the quality of the abstract?
   (d) the title of the paper?
10. The paper is:
   (a) acceptable?
   (b) acceptable with minor revisions?
   (c) acceptable with major revisions?
   (d) not acceptable?

The Journal of Environmental Education Reviewer Questions

1. Will the manuscript make an original contribution to the literature of environmental education, interpretation, and communication?
2. Does this article advance the instruction, theory, method, and/or practice in the environmental education field?
3. Is the manuscript an authoritative work, or is it speculative writing based mainly on opinion or conjecture rather than upon verifiable facts?
4. Does the author provide sufficient documentation to support the thesis?
5. Does the author use original source material?
6. Is the article clearly written, and is the material organized in a coherent manner?