Science & Society

Improving research misconduct policies
Evidence from social psychology could inform better policies to prevent misconduct in research

Barbara K Redman & Arthur L Caplan
Division of Medical Ethics, New York University Langone Medical Center, New York, NY, USA. E-mail: [email protected]
DOI 10.15252/embr.201744110 | Published online 10 March 2017

Current US and international policies for dealing with misconduct in biomedical research follow a largely intuitive approach. For research funded by the Public Health Service in the USA, the definition of research misconduct is “fabrication (making up data or results and recording or reporting them), falsification (manipulating research materials, equipment or processes or changing or omitting data or results such that the research is not accurately represented in the research record), and plagiarism (the appropriation of another person’s ideas, processes, results, or words without giving appropriate credit)”. It does not include honest error or differences of opinion but “must be committed intentionally, knowingly, or recklessly; and the allegation must be proven by a preponderance of the evidence” (42 CFR 93.103, 104).

However, the efficiency of research misconduct policies in meeting the scientific and social goals of ensuring public health and safety, the integrity of research, and the prudent expenditure of public funds remains largely unknown, as do the causal factors that underlie misconduct. A recent systematic review by the Cochrane organization found the evidence on which these policies are based to be incomplete, of low quality, and not generalizable.

For more than four decades, research misconduct was thought to be extremely rare, attributable largely to individual psychopathology—the proverbial bad apples—and manageable through self-correction by the scientific community. The structure of regulation in the USA reflects many of these assumptions: that the individual is solely at fault; that institutions can adequately manage allegations; and that reliance on whistleblowers to draw attention to fabrication, falsification, and plagiarism (FFP) is sufficient.

Findings from social psychology

However, evidence from disciplines such as social psychology suggests that it is more than just a few “pathological” individuals who are vulnerable to questionable behavior. It also shows that psychological and social factors in the working environment can encourage or discourage research integrity and that addressing these factors is crucial to preventing misconduct. Supportive environments lower temptations to cut corners, define clear boundaries between right and wrong, encourage peer monitoring, and remind people of their ethical values. Social psychology may therefore be helpful in restructuring rules and policies to identify and correct weaknesses and to support strengths so as to prevent misconduct.

Two overall themes emerge from the literature on why people engage in unethical conduct. First, individuals in pursuit of their goals may adjust their behavior when faced with unethical situations, because they care about their identity and reputation [1]. Shared conceptions of morality in groups frequently anchor each individual’s internal moral compass, and group norms may affect this outcome, as may actions taken to ensure group inclusion [2], ethical or not, that benefit the group. Second, while an individual’s characteristic patterns of thought, emotion, and behavior, self-regulation, and moral identity are relatively stable and associated with ethical behavior, they are malleable depending on situation and context [3].

Social psychology also shows that moral identity, licensing, disengagement, and psychological closeness are key factors that shape moral behavior. Moral identity is the degree to which being a moral person is important to an individual’s identity, and it is a modest predictor of moral behavior. People’s moral identity varies in part with the norms of their culture and with situational contexts such as seeing others cheat. Those with a stronger moral identity are more immune to justifying bad behavior. People will be dishonest for self-gain only to the extent that it does not threaten their moral self-image [4].

Moral behavior may “license” someone to perform questionable actions without fear of losing one’s moral image [5]. Individuals can use such a license to rationalize misconduct and to strategically forget moral standards, a phenomenon called “ethical amnesia”. This makes it more likely that they will act dishonestly again. Moral disengagement describes a situation in which people can do unethical things without distress, using various cognitive mechanisms: euphemistic labeling, advantageous comparison, displacement or diffusion of responsibility, disregarding or distorting the assessment of consequences, dehumanization, and fuzzing the attribution of blame. A focus on meeting goals at all costs can lead to rationalization of unethical behavior and can divert attention from the ethics of said behavior. This is particularly problematic in a situation of near-compulsion to rescue a failing project, which can lead to concealing negative outcomes. Disengagement represents a deactivation of self-regulatory processes and is a strong predictor of unethical behavior.

Environmental factors

Psychological closeness with someone who behaves unethically amplifies moral disengagement. Although there is unfortunately little empirical evidence showing how it is initiated, it is important to note that most individuals frequently operate in a moral gray zone in which it is not always clear what constitutes honest and dishonest behavior. Being placed in a morally permissive environment is also sufficient to increase both cheating and moral disengagement. Increasing moral saliency by having individuals read or sign an honor code significantly reduces unethical behavior and prevents moral disengagement [6]. So does increasing self-awareness through creating the perception of being watched or supervised [7].

People may not consciously decide to do wrong. Provoked by intense competition for resources, feeling they are in a zero-sum game, seeing others act in unethical ways, or seeing no other viable way out of a quandary, individuals can drift into unethical behavior without even realizing that they have changed their ethical standards. This often occurs within an organization or group culture of ethical “drift”. Likewise, a series of small infractions that gradually increase over time may facilitate someone’s propensity to morally disengage. Such a slippery slope can be prevented with vigilance to recognize the wrongness of a person’s behavior or with clear standards that are enforced with warnings and negative sanctions [8].

In summary, social psychology has shown that morality is dynamic, malleable, and context-dependent, and that the ability to self-regulate behavior can become depleted. Individuals regularly fail to resist temptations to act dishonestly, may not even recognize that a moral issue is at stake, or fail to notice unethical behavior around them if an environment tolerates or permits dishonesty [6].

Ignoring empirical findings

None of these findings are recognized in current regulations, policies, or training in the USA or in any other nation. This contributes to poor policy in several ways. It places full responsibility on the individual, without considering the context or the responsibility of the group or the institutional environment; this is particularly important when dealing with the damaging effects of hypercompetition in the life sciences. Using whistleblowers as nearly the sole monitors for misconduct is likely a key factor in the significant under-reporting of research misconduct: group loyalty against a whistleblower will likely prevail, and because of the strong disincentives to engage in whistleblowing, few come forward. Ethical norms are applied inconsistently. Under-reporting significantly decreases the likelihood of getting caught, despite the robust finding that raising monitoring levels and increasing detection rates decrease the temptation to cheat [9].

These empirical findings are largely ignored in mandatory education on responsible conduct of research. For example, simply auditing the process of moral disengagement reduces individuals’ tendencies to disengage. Individuals can be educated to identify tempting situations and to understand the psychological toll of ethical dissonance. They can also be taught compensatory strategies: attending to basic scientific principles instead of becoming overwhelmed by the prevailing problem; following appropriate role models; recognizing when the information needed to make a decision is insufficient; and practicing transparency by documenting, and not obscuring, information about decisions. But most importantly, they should be convinced that ordinary people who value their morality ought not cut corners when faced with an opportunity to gain from dishonest behavior.

Although misconduct rules are often narrowly focused on processing complaints, they do note that institutions must foster a working environment that promotes the responsible conduct of research and discourages misconduct. Yet analysis of research environments along the lines pointed to by social psychology is almost non-existent. Institutions and research disciplines must establish some degree of surveillance to decrease opportunities for research misconduct (RM); to detect psychological and organizational conditions that would predispose scientists to act unethically; and to detect RM when it occurs. As ambiguous norms offer an opportunity for justifying unethical behavior, both institutions and disciplines should make standards and expectations clear (see Sidebar A). Since group norms and role models can support or suppress moral identity, monitoring these environments can be helpful.

A case study

Diederik Stapel, a Dutch social psychologist, fabricated and falsified data and whole studies over many years and at three universities, affecting 55 publications and 10 PhD dissertations and ultimately resulting in his dismissal from Tilburg University. His case is likely the most complete investigation of a scientist found to have committed research misconduct. Stapel’s book, Derailment, documents his long story of misadventure. Although clearly open to bias, it provides a rare opportunity for a social psychologist to identify the principles that drove him to commit fraud.

Stapel’s view was that science had become a business in which journals wanted simple stories of high impact, while academics faced escalating publication targets and his own research showed only small effect sizes. At the same time, social psychology was characterized as having a culture of careless, selective, and uncritical handling of data. This conclusion is generally supported by Jussim and colleagues [10], who also note the powerful incentives for presenting a compelling story. Stapel’s story likely represents moral disengagement against a background of perceived necessity. Stapel was overambitious, addicted to acclaim, and felt a strong need to be in control. In order to gain attention, he went beyond practices he knew others were using—dumping outliers, experimenting until desired results were obtained, selective reporting—and began to change numbers. He eventually descended into fabricating entire studies, leaving no documentation, always working alone so he would not be detected, and always building on existing bodies of research so that his fraudulent work appeared sound. He started to believe his own fabrications and could no longer clearly recall whether or not he had done the work. He could not stop, strongly suggesting ego depletion. Allegations from earlier whistleblowers had not been acted upon, and coauthors failed to spot obvious fraud. His careless fabrication of numbers was eventually discovered, and Stapel was confronted by university authorities. He was unable to keep his lies straight and recognized that what he had done was wrong. He was also left wondering whether he was the only “bad apple” and how convenient it was for the university and the field to affirm the belief that he was.

Integrating the empirical and the normative

People who care about their moral values still can and do bad things, both intentionally and unintentionally. Sometimes, people’s morals degrade slowly, or they are not aware of their own or others’ unethical actions [1]. People can switch their ethicality on or off, depending on the permissiveness of the environment [6], the prominence of positive role models, and monitoring by one’s self and others. RM-related policies, such as those for managing conflicts of interest (COI), may unwittingly give researchers an exaggerated confidence that they are acting morally and are impervious to the effects of COI.


Sidebar A: Recommendations for RM policy
(i) Document institutions’ measures to create a research environment that promotes the responsible conduct of research and discourages RM.
(ii) Harmonize RM regulations across funding sectors and internationally to create common standards for research.
(iii) Expand the level of accountability to co-investigators, co-authors, and funders.

Measures that institutions could/should implement to discourage RM
(i) Validate instruments to measure the organizational research climate for internal use, to identify toxic climates, and to devise interventions for improving the research climate.
(ii) Assure that the institution’s Research Integrity Officer (RIO), the person who receives allegations of RM and oversees a fair inquiry, is well trained and looks beyond individual cases to identify trends and patterns.
(iii) Sponsor discussions within departments and research teams about scientific practices and ethical violations to develop shared norms for research.
(iv) Assure oversight of the quality of research being undertaken within the institution.
(v) Reward and support those who appropriately bring violations of research integrity to attention.

Sidebar B: Further reading
Ayal S, Gino F, Barkan R, Ariely D (2015) Three principles to REVISE people’s unethical behavior. Perspect Psychol Sci 10: 738–741
Bandura A (1986) Social foundation of thought and action: a social cognitive theory. Englewood Cliffs, NJ, USA: Prentice-Hall
Bazerman MH, Gino F (2012) Behavioral ethics: toward a deeper understanding of moral judgment and dishonesty. Annu Rev Law Soc Sci 8: 85–104
Buchanan A, Powell R (2016) Toward a naturalistic theory of moral progress. Ethics 126: 983–1014
Effron DA (2014) Making mountains of morality from molehills of virtue: threat causes people to overestimate their moral credentials. Pers Soc Psychol Bull 40: 972–985
Hertz SG, Krettenauer T (2016) Does moral identity effectively predict moral behavior? A meta-analysis. Rev Gen Psychol 20: 129–140
Jordan J, Leliveld MC, Tenbrunsel AE (2015) The moral self-image scale: measuring and understanding the malleability of the moral self. Front Psychol 6: 1–16
Kouchaki M, Gino F (2016) Memories of unethical actions become obfuscated over time. Proc Natl Acad Sci USA 113: 6166–6171
Levelt Committee, Noort Committee, Drenth Committee (2012) Flawed science: the fraudulent research practices of social psychologist Diederik Stapel. The Netherlands: Tilburg University. https://www.tilburguniversity.edu/upload/3ff904d7-547b-40ae-85fe-bea38e05a34a_Final%20report%20Flawed%20Science.pdf
Marusic A, Wager E, Utrobicic A, Rothstein HR, Sambunjak D (2016) Interventions to prevent misconduct and promote integrity in research and publication. Cochrane Database Syst Rev 4: 1–93
McAlister AL (2001) Moral disengagement, measurement and modification. J Peace Res 38: 87–99
Mecca JT, Medeiros KE, Giorgini V, Gibson C, Mumford MD, Connelly S, Devenport LD (2014) The influence of compensatory strategies on ethical decision making. Ethics Behav 24: 73–89
Moore C, Gino F (2013) Ethically adrift: how others pull our moral compass from true North, and how we can fix it. Res Organ Behav 33: 53–77
Ordonez LD, Welsh DT (2015) Immoral goals: how goal setting may lead to unethical behavior. Curr Opin Psychol 6: 93–96
Ruedy NE, Moore C, Gino F, Schweitzer ME (2013) The cheater’s high: the unexpected affective benefits of unethical behavior. J Pers Soc Psychol 105: 531–548
Shalvi S, Gino F, Barkan R, Ayal S (2015) Self-serving justification: doing wrong and feeling moral. Curr Dir Psychol Sci 24: 125–130
Sternberg RJ (2012) A model for ethical reasoning. Rev Gen Psychol 16: 319–326
Thau S, Derfler-Rozin R, Pitesa M, Mitchell MS, Pillutla MM (2015) Unethical for the sake of the group: risk of social exclusion and pro-group unethical behavior. J Appl Psychol 100: 98–113
Titus SL, Wells JA, Rhoades LJ (2008) Repairing research integrity. Nature 453: 980–982
Trevino LK, den Nieuwenboer NA, Kish-Gephart JJ (2014) (Un)Ethical behavior in organizations. Annu Rev Psychol 65: 635–660

Unfortunately, there are no direct studies of social psychological mechanisms in research misconduct cases, but it is reasonable to expect that these mechanisms will be present in research in general and in cases in which regulations are violated. Teaching individual researchers, students, and colleagues to recognize them and to intervene in the situations that trigger or sustain them is a first step. But it is altogether incomplete, because it absolves institutions of their responsibility. Research communities also have responsibilities, which are not widely recognized, to self-regulate the quality of research and thereby prevent opportunities for misconduct.

Dealing effectively with research misconduct requires policies that are based on evidence (see Sidebar A). Beyond social psychology, further evidence exists in areas such as the impact of institutional economic incentives, which, if structured primarily to appeal to self-interest, can crowd out ethical reasoning and accepted social norms. Analysis and incorporation of these bodies of evidence into policy and practice is essential not only to improve the quality of research and to protect scientists, but also to construct an appropriate environment in which to practice science.


Since the development of formal research misconduct policies, the complexities of practicing science ethically have been framed as being entirely under the willful control of scientists. A review of research findings from social psychology suggests that this view is oversimplified at best and likely incorrect. Scientists cannot so easily be classified as either totally honest or dishonest; rather, they are individuals working within disciplinary and institutional environments that are ambiguous or that may have slid into widespread corruption. Until we incorporate what is empirically known about misbehavior into managing and teaching about research misconduct, we are not doing the best we can.

Conflict of interest
The authors declare that they have no conflict of interest.

References
1. Gino F (2015) Understanding ordinary unethical behavior: why people who value morality act immorally. Curr Opin Behav Sci 3: 107–111
2. Ellemers N, Van der Toorn J (2015) Groups as moral anchors. Curr Opin Psychol 6: 189–191
3. Cohen TR, Panter AT, Turan N, Morse L, Kim Y (2014) Moral character in the workplace. J Pers Soc Psychol 107: 943–963
4. Gamiel E, Peer E (2013) Explicit risk of getting caught does not affect unethical behavior. J Appl Soc Psychol 43: 1281–1288
5. Blanken I, van de Ven N, Zeelenberg M (2015) A meta-analytic review of moral licensing. Pers Soc Psychol Bull 41: 540–558
6. Shu LL, Gino F, Bazerman MH (2011) Dishonest deed, clear conscience: when cheating leads to moral disengagement and motivated forgetting. Pers Soc Psychol Bull 37: 330–349
7. Vincent LC, Emich KJ, Goncalo JA (2013) Stretching the moral gray zone: positive affect, moral disengagement and dishonesty. Psychol Sci 24: 595–599
8. Welsh DT, Ordonez LD, Snyder D, Christian M (2015) The slippery slope: how small ethical transgressions pave the way for larger future transgressions. J Appl Psychol 100: 114–127
9. Rosenbaum S, Billinger S, Stieglitz N (2014) Let’s be honest: a review of experimental evidence of honesty and truth-telling. J Econ Psychol 45: 181–196
10. Jussim L, Crawford J, Anglin S, Stevens S, Duarte J (2016) Interpretations and methods: towards a more effectively self-correcting social psychology. J Exp Soc Psychol 66: 116–133