May 9, 2019
In Fraud We Trust: Top 5 Cases of Misconduct in University Research
There’s a thin line between madness and immorality. The idea of the “mad scientist” has taken on a charming, even glorified image in popular culture. From the campy portrayal of Nikola Tesla in the first issue of Superman, to Dr. Frankenstein, to Dr. Emmett Brown of Back to the Future, there’s no question Hollywood has softened the idea of the mad scientist. So I will not paint the scientists involved in these five cases of research fraud as such. The immoral actions of these researchers affected not just their own lives, but also the lives and careers of innocent students, patients, and colleagues. Academic fraud is not only a crime; it is a threat to the intellectual integrity upon which the growth of knowledge rests. It also compromises the institution itself, which takes a blow to its reputation for allowing academic misconduct to go unnoticed on its watch. Here you will find the five most notorious cases of fraud in university research from just the last few years.
Fraud in Psychology Research
In 2011, Dutch psychologist Diederik Stapel was found to have committed academic fraud in a number of publications over the course of ten years, spanning three different universities: the University of Groningen, the University of Amsterdam, and Tilburg University.
Among the dozens of studies in question, he most notably falsified data in a study that analyzed racial stereotyping and the effects of advertisements on personal identity. The journal Science published the study, which claimed that people were more likely to stereotype and discriminate against members of another race in a chaotic, messy environment than in an organized, structured one. Stapel produced another study claiming that the average person judged job applicants to be more competent if they had a male voice. Both studies were later found to be contaminated with false, manipulated data.
Fellow psychologists discovered the falsifications and reported that Stapel’s work did not stand up to scrutiny. They concluded that he had taken advantage of a loose system, under which researchers could work in almost total secrecy and quietly nudge their data toward desired conclusions with little fear of being challenged. Newspapers all over the world had covered Stapel’s research. He had also supervised more than a dozen doctoral theses, all of which were rendered invalid, compromising the integrity of his former students’ degrees.
“I have failed as a scientist and a researcher. I feel ashamed for it and have great regret,” Stapel lamented to the New York Times. You can read the particulars of this fraud case here.
Duke University Cancer Research Fraud
In 2010, Dr. Anil Potti left Duke University after allegations of research fraud surfaced. The fraud came in waves. First, Dr. Potti flagrantly lied about being a Rhodes Scholar to obtain hundreds of thousands of dollars in grant money from the American Cancer Society. Then he was caught outright falsifying data after one of his theories for personalized cancer treatment was disproven. That theory was meant to justify clinical trials involving more than a hundred patients; once it was disproven, the trials could not legitimately continue. Dr. Potti falsified data in order to keep the trials going and secure further funding.
Over a dozen papers that he published were retracted from various medical journals, including the New England Journal of Medicine.
Dr. Potti had been working on personalized cancer treatment, which he hailed as “the holy grail of cancer.” Many people’s bodies fail to respond to more traditional cancer treatments; personalized treatments offer hope because patients receive therapies tailored to their own unique biology and the type of tumors they have. Because of this, patients flocked to Duke to register for the trials, and they were even told there was an 80% chance the right drug would be found for them. Excited by this renewed hope for their cancer treatment, patients trusted Dr. Potti’s trials and drugs. Sadly, many of them suffered unusual side effects such as blood clots and damaged joints, and participants later filed a lawsuit against Duke alleging that the institution had subjected them to badly administered chemotherapy.
Duke settled these lawsuits with the families of the patients. You can read details of the case here.
Plagiarism in Kansas
Mahesh Visvanathan and Gerald Lushington, two computer scientists at the University of Kansas, admitted to accusations of plagiarism. They copied large chunks of their research from the works of other scientists in their field. The plagiarism was so pervasive that even the summary statement of their presentation was lifted from another scientist’s article in a renowned journal.
Visvanathan and Lushington oversaw a program at the University of Kansas in which researchers reviewed and processed large amounts of data for DNA analysis. In this case, Visvanathan committed the plagiarism, and Lushington knowingly refrained from reporting it to the university. Learn more about this case here.
Columbia University Research Misconduct
The year was 2010. Bengü Sezen was finally caught falsifying data after ten years of continuous fraud. Her fraudulent activity was so blatant that she even invented fake people and organizations in an effort to support her research results. Sezen was found guilty of more than 20 acts of research misconduct, and about ten of her research papers were retracted for plagiarism and outright fabrication.
Sezen’s doctoral thesis was fabricated entirely in order to produce her desired results. Her misconduct also took a heavy toll on the careers of other young scientists who worked with her, many of whom spent large portions of their graduate careers trying to reproduce results she had simply invented.
Columbia University moved to revoke her Ph.D. in chemistry. Sezen fled the country during the investigation. Read further details about this case here.
Penn State Fraud
In 2012, Craig Grimes defrauded the U.S. government to the tune of $3 million. He pleaded guilty to wire fraud, money laundering, and making false statements to obtain grant money.
Grimes misled the National Institutes of Health (NIH) and the National Science Foundation (NSF) into granting him $1.2 million for research on gases in blood, which helps detect disorders in infants. The U.S. Attorney’s Office revealed that Grimes never carried out this research and instead spent the majority of the granted funds on personal expenses. In addition to that $1.2 million, Grimes falsified information to obtain another $1.9 million in grant money through the American Recovery and Reinvestment Act. A federal judge sentenced Grimes to 41 months in prison and ordered him to repay more than $660,000 to Penn State, the NIH, and the NSF.
Check out the details about this case here.
Top Harvard cancer researchers accused of scientific fraud; 37 studies affected
Researchers accused of manipulating data images with copy-and-paste.
Beth Mole - Jan 22, 2024 10:45 pm UTC
The Dana-Farber Cancer Institute, an affiliate of Harvard Medical School, is seeking to retract six scientific studies and correct 31 others that were published by the institute’s top researchers, including its CEO. The researchers are accused of manipulating data images with simple methods, primarily with copy-and-paste in image editing software, such as Adobe Photoshop.
DFCI Research Integrity Officer Barrett Rollins told The Harvard Crimson that molecular biologist and data sleuth Sholto David had contacted DFCI with allegations of data manipulation in 57 DFCI-led studies. Rollins said that the institute is "committed to a culture of accountability and integrity," and that "every inquiry about research integrity is examined fully."
The allegations are against: DFCI President and CEO Laurie Glimcher, Executive Vice President and COO William Hahn, Senior Vice President for Experimental Medicine Irene Ghobrial, and Harvard Medical School professor Kenneth Anderson.
The Wall Street Journal noted that Rollins, the integrity officer, is also a co-author on two of the studies. He told the outlet he is recused from decisions involving those studies.
Amid the institute's internal review, Rollins said the institute identified 38 studies in which DFCI researchers are primarily responsible for potential manipulation. The institute is seeking retraction of six studies and is contacting scientific publishers to correct 31 others, totaling 37 studies. The one remaining study of the 38 is still being reviewed.
Of the remaining 19 studies identified by David, three were cleared of manipulation allegations, and 16 were determined to have had the data in question collected at labs outside of DFCI. Those studies are still under investigation, Rollins told The Harvard Crimson. "Where possible, the heads of all of the other laboratories have been contacted and we will work with them to see that they correct the literature as warranted,” Rollins wrote in a statement.
Despite finding false data and manipulated images, Rollins stressed that this doesn't necessarily mean that scientific misconduct occurred, and the institute has not yet made such a determination. The "presence of image discrepancies in a paper is not evidence of an author's intent to deceive," Rollins wrote. "That conclusion can only be drawn after a careful, fact-based examination which is an integral part of our response. Our experience is that errors are often unintentional and do not rise to the level of misconduct."
The very simple methods used to manipulate the DFCI data are remarkably common among falsified scientific studies, however. Data sleuths have gotten better and better at spotting such lazy manipulations, including copied-and-pasted duplicates that are sometimes rotated and adjusted for size, brightness, and contrast. As Ars recently reported, all journals in the Science family now use an AI-powered tool to spot just this kind of image recycling because it is so common.
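To make concrete what "lazy" copy-and-paste manipulation looks like to a detection tool, here is a minimal sketch in Python/NumPy that flags verbatim duplicated regions by hashing fixed-size patches. This is a toy illustration, not the tool Science or DFCI uses; production checkers also handle the rotated, resized, and brightness-adjusted duplicates described above.

```python
import numpy as np

def find_duplicate_patches(img: np.ndarray, size: int = 8):
    """Return pairs of (row, col) corners of identical size x size patches."""
    seen = {}    # patch bytes -> corner where that patch first appeared
    dupes = []
    h, w = img.shape
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            key = img[y:y + size, x:x + size].tobytes()
            if key in seen:
                dupes.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return dupes

# Simulate a figure panel with one copied-and-pasted region.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
img[32:40, 32:40] = img[0:8, 0:8]   # paste a copy of the top-left patch
print(find_duplicate_patches(img))  # [((0, 0), (32, 32))]
```

Catching duplicates that were rotated or adjusted for brightness would require replacing the exact byte hash with something like a perceptual hash or normalized cross-correlation.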
Open Access
Policy Forum
The Costs and Underappreciated Consequences of Research Misconduct: A Case Study
* E-mail: [email protected]
Affiliation Roswell Park Cancer Institute, Buffalo, New York, United States of America
- Arthur M. Michalek,
- Alan D. Hutson,
- Camille P. Wicher,
- Donald L. Trump
Published: August 17, 2010
- https://doi.org/10.1371/journal.pmed.1000318
Citation: Michalek AM, Hutson AD, Wicher CP, Trump DL (2010) The Costs and Underappreciated Consequences of Research Misconduct: A Case Study. PLoS Med 7(8): e1000318. https://doi.org/10.1371/journal.pmed.1000318
Copyright: © 2010 Michalek et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: The authors received no specific funding for this article.
Competing interests: The authors have declared that no competing interests exist.
Abbreviations: AC, aggregate costs; IC, intangible costs; MC, measurable costs; ORI, Office of Research Integrity
Provenance: Not commissioned; externally peer reviewed.
Arthur M. Michalek is Senior Vice President for Educational Affairs; Alan D. Hutson is Chair, Biostatistics; Camille P. Wicher is Vice President, Corporate Ethics and Research Subject Protection; Donald L. Trump is President and CEO.
Summary Points
- The consequences of scientific misconduct are far-ranging, and the costs associated with its investigation are substantial.
- It is possible to estimate the cost (direct and indirect) of investigating a single case of scientific misconduct.
- For a specific investigation for which costs were estimated for all phases of the review process, direct cost estimates approached US$525,000.
- For an individual country, the total costs associated with the review of all cases of scientific misconduct, both reported and not reported to the Office of Research Integrity, are likely to be many times higher.
Fallout from scientific misconduct can be pervasive. From the broadest perspective, the public, current and future patients, funding agencies, and even the course of research may be adversely affected by scientific misconduct. At the local level, members of the perpetrator's laboratory, colleagues, trainees, and the financial resources and reputation of the home institution may become tainted. The costs associated with these acts are substantial. This article presents a model we have developed to estimate the monetary costs of scientific misconduct. Estimates are based on a case that occurred at our institution, the Roswell Park Cancer Institute, which is a National Cancer Institute–designated Comprehensive Cancer Center located in the United States. Our experiences will likely not be wholly representative of other institutions, but we believe they can be instructive and serve as a guide in the calculation of costs at other institutions.
Scientific misconduct is defined by the US Office of Research Integrity (ORI) as “fabrication, falsification, or plagiarism in proposing, performing, or reviewing research or in reporting research results” [1] . The misconduct must be “committed intentionally, knowingly, or recklessly, and there must be a significant departure from accepted practices” [1] .
Scientific misconduct likely dates back to the earliest days of scientific inquiry. Fanelli [2] conducted a meta-analysis of published surveys that asked scientists whether they or a colleague had ever committed scientific misconduct. Approximately 2% of respondents admitted to having committed scientific misconduct, and 14% reported knowledge of such behavior by their colleagues [2]. The deleterious effects of these transgressions on the scientific knowledge base cannot be overstated. A poignant example is related by Shafer in his review of Scott Reuben's fraudulent research, which comprised 21 articles and abstracts spanning 15 years [3]. These articles focused on the long-term beneficial effects of perioperative nonsteroidal anti-inflammatory drug administration. As Shafer so eloquently stated, this misinformation “is deeply woven into many review articles, meta-analyses, lecture summaries, and the memories …” of individuals exposed to this information. The obvious questions are: can we re-educate everyone who has been swayed, consciously or unconsciously, by fraudulent research and, if so, how?
Assessing the Costs of Scientific Misconduct
The costs associated with scientific misconduct can be divided into three domains: conduct of the fraudulent research, investigation, and remediation.
Costs Associated with the Conduct of the Fraudulent Research
These costs include all monetary investments (institute start-up funds, grant funding) made in the fabricated research as well as intangibles such as lost productivity of the associated research group, loss of trust, the demoralization of faculty and trainees, and the misdirection of other labs' research efforts. In some cases, the institution may be required to reimburse the funding agency for the costs of the fraudulent research, pay penalties, and, in certain instances, temporarily suspend other studies during the investigation.
Investigative Costs
An aspect frequently overlooked in discussions of misconduct costs is the cost directly related to the investigation itself. These costs vary considerably and depend on the nature of the incident (type of misconduct, complexity, etc.) and the associated time required to investigate. However, all investigations share similar elements that need to be considered when calculating overall costs. At our institution, in keeping with the model proposed by the ORI, allegations of misconduct proceed through three levels of review, each assuming escalating responsibilities and costs.
At our institution, allegations are initially reviewed by the Vice President for Corporate Ethics and the Dean of Educational Affairs. If the allegation is determined to have merit, an inquiry is initiated. This second level requires review by a committee appointed by the Vice President for Corporate Ethics and the Dean; membership consists of the Vice President for Corporate Ethics, the Dean, four faculty members, and an attorney. The Inquiry Committee determines whether there is sufficient evidence of possible research misconduct to warrant an investigation. The inquiry is not intended to reach a final conclusion about whether research misconduct definitely occurred or who was responsible.

That is the role of the Investigation Committee, which is also appointed by the Vice President for Corporate Ethics and the Dean. Its membership is broader and includes other professional expertise: the Vice President for Corporate Ethics, the Dean, at least two individuals from outside the unit or department of the complainant(s) who are expert in the subject matter or scientific area, a statistician, a representative from Human Resources, an attorney, and any other members deemed appropriate. The purpose of the investigation is to explore the allegations in detail, to examine the evidence in depth, and to determine specifically whether research misconduct has been committed, by whom, and to what extent.
Costs of the investigation may be divided into personnel (committee membership, witnesses, and support staff), material costs, and consultant costs. The most expensive component of any investigation is faculty time. Faculty members engaged in our reviews are usually associate or full professors. Faculty members on investigation committees spend considerable time both in and out of the formal committee meetings. Time spent outside formal meetings is directed at reviewing materials, securing additional information, reanalysis of data, writing, and other preparatory activities. Our experience is that faculty spend anywhere from three to ten times more time working outside of meetings than they do in meetings. Individual time commitments vary based on the individual's expertise as well as committee assignments. Costs associated with witnesses' time must also be considered. The number and frequency of witness interviews varies based on the complexity of the investigation. Witnesses also spend time outside of meetings preparing their testimony. Administrative support costs include secretarial and clerical time needed for transcription of recordings, photocopying, filing, and other related tasks. Most investigations will require sequestration of physical materials including all laboratory notebooks, computers, and other electronic storage devices. At times forensic computer experts are required to analyze hard drives as well as to retrieve e-mail exchanges or other documents that are still resident on the institutional server.
Remediation Costs
These costs include those necessitated by program closure. Not only are funds previously invested in the fraudulent research lost, but so too are funds currently supporting it. Moreover, pending grant applications may be recalled, and further funding of existing grants may be delayed or lost. Loss of funding can be devastating to the honest members of the affected laboratory. A myriad of administrative decisions must be made regarding such things as the continuance of trainees (pre- and postdoctoral) and staff members from the affected lab, the impact on trainees' research, and the costs of possibly phasing out bona fide research conducted by the guilty party. Other less obvious costs include reputational damage to the institution, which may affect the competitiveness of future grants as well as fundraising; for those involved in patient care, there is also potential patient harm and loss of patient trust and revenue. Institutional expenses may also include the cost of civil legal action from patients. Extrainstitutional costs may include intellectual corruption of the scientific literature, misdirection of future research, costs to journals in retracting deceptive research, and costs to revise guidelines based on fraudulent research.
A Possible Statistical Approach for Scientific Fraud Analyses
Very little research has been done to develop methods for formally modeling the cost of scientific fraud. Research in this area has been directed primarily at modeling the behavior of the individual scientist with respect to incentives for committing a fraudulent act. Our aim has been to develop a data-based modeling approach to better understand the factors that contribute to the overall cost of scientific fraud. In its simplest form, the aggregate cost (AC) of a case can be expressed as AC = MC + IC + ε, where:
- MC = measurable costs,
- IC = intangible costs,
- ε = stochastic error.
Examples of some ICs would include loss of future earnings related to a line of research; reputational damage to the institution, which may affect competitiveness for future grants and contracts; negative effects on fundraising; and, for those involved in patient care, loss of patient revenue.
Examples of measurable cost factors include:
- x1 = grant direct and indirect dollars returned to the funding agency,
- x2 = institutional legal costs,
- x3 = hourly cost of faculty time commitment to an investigation panel,
- x4 = cost of sequestration of evidentiary materials,
- x5 = human resource–related costs,
- x6 = institutional start-up costs for supporting the fraudulent research,
- x7 = Institutional Review Board–related costs for suspending and closing clinical studies,
- x8 = Institutional Animal Care and Use Committee–related costs for suspending and closing animal studies,
- x9 = payment of penalties related to tainted research,
- x10 = hourly costs associated with retracting published research,
- x11 = hourly costs of specialized consultants needed for advisement to the investigation panel.
To date we have not gathered cost factor information prospectively or with any degree of precision in order to fit these types of models. Hence, our cost estimates to date amount to a “best guess” scenario, as illustrated in the next section. Ultimately, to apply this model, a database will be developed from which we can statistically examine the relative contributions of each factor to the MC. The fitted model may then be utilized, for example, to estimate the cost of a future misconduct case in terms of resource management.
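Read together, the definitions above suggest an additive cost model. The following is a sketch consistent with those definitions, not the authors' fitted specification; the coefficients β_i are illustrative placeholders that would be estimated from a database of past cases:

```latex
\begin{aligned}
\mathrm{AC} &= \mathrm{MC} + \mathrm{IC} + \varepsilon,\\
\mathrm{MC} &\approx \beta_0 + \sum_{i=1}^{11} \beta_i x_i,
\end{aligned}
```

where AC is the aggregate cost of a case, the x_i are the measurable cost factors listed above, and ε is the stochastic error term.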
Applying This Approach to a Case
The following case study was based on an actual investigation. Cost estimates are given in US dollars.
Allegation.
An allegation of research misconduct was made against a senior scientist for enhancing and fabricating images and data contained in a federal grant application.
The allegation, in accordance with institute policy, was reviewed by the Vice President for Corporate Ethics and the Dean. They determined that there was sufficiently credible and specific evidence of potential research misconduct to warrant an inquiry. The deliberation and data gathering to support this decision cost approximately $1,000.00.
An Inquiry Panel was convened consisting of the aforementioned membership. The Panel reviewed the grant application in question, additional information regarding more than a dozen figures in the grant, as well as e-mail correspondence between the respondent and several staff members. The panel concluded that there was sufficient evidence to support the allegation and that an investigation was warranted. Panel time required to review and discuss data to support this decision cost about $13,000.00.
At this point the respondent's laboratory equipment was sequestered as were all lab notebooks, computer hard drives, and other electronic devices. Sequestration involved members of institute security, the Information Technology department, and an outside forensic computer company. All computer and electronic devices were copied and copies supplied to the laboratory personnel so the affected lab could continue working on research other than that related to the questionable project until the investigation was completed and a decision had been reached. These actions cost an estimated $10,000.00.
Investigation.
An Investigation Committee was empanelled as described above. Over the course of ten meetings the Committee reviewed all of the questionable lab figures, primary data sources from lab books, electronic data and figures, and e-mail correspondence. The Committee also interviewed the respondent, the complainant, and other members of the laboratory in question. Given the complexity of the case the Investigation Committee was composed of eight individuals who spent well over 100 hours in meetings (∼$78,000) and an estimated 700 hours outside of committee (∼$430,000). Other related costs included transcriptionist and clerical support for photocopying, filing, scheduling, and correspondence (∼$2,500). Moreover, given that the Investigation Committee determined that there was evidence of scientific misconduct, a review of the scientist's other grant applications as well as manuscripts was undertaken. Approximately 50 person-hours were spent reviewing other grants and manuscripts (∼$4,000).
Total estimate of costs.
We estimate that the direct cost of this case approached $525,000. This includes faculty and witness salaries of about $512,000, clerical support costs of ∼$2,500, and other personnel costs (security, Information Technology, contracted forensics) of ∼$10,000. Other significant costs not factored into the above figure (indirect costs) include deliberation time of senior administrative faculty (CEO, Senior Vice Presidents for Scientific and Translational Research, Executive Vice President, Chair), loss of current grants ($283,000), withdrawal of two pending grant applications (∼$615,000) and one renewal (∼$363,000), the cost to the Institute of maintaining affected pre- and postdocs until other laboratories could be found (∼$40,000), and the cost of maintaining all the records for at least 6 years after the investigation has been completed.
The precise prevalence of scientific misconduct is unknown, owing largely to its clandestine nature and to underreporting. Fanelli [2] estimates occurrence at between 2% (self-reported) and 14% (reported of colleagues). Other sources cite the risk of misconduct as being less than 1% [4]. The costs associated with institutional investigations are quite significant. We conservatively estimate that if one were to apply our observed costs to all of the allegations of misconduct reported in the United States to the ORI (n = 217 cases) in its last reporting year, the direct costs would exceed $110 million. We hope that our work will encourage others to add to our understanding of these costs.
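The arithmetic behind the case total and the national extrapolation can be checked directly. The figures below are the rounded estimates quoted in the case study, and the extrapolation assumes, as the authors do, that each of the 217 ORI-reported allegations costs roughly as much as this single case:

```python
# Direct-cost components of the case study (rounded estimates from the text).
meeting_time = 78_000            # committee time in meetings (100+ hours)
outside_time = 430_000           # committee time outside meetings (~700 hours)
other_reviews = 4_000            # ~50 person-hours reviewing other grants/manuscripts
clerical = 2_500                 # transcription, photocopying, filing, scheduling
security_it_forensics = 10_000   # sequestration, IT, contracted computer forensics

faculty_and_witness = meeting_time + outside_time + other_reviews
direct_total = faculty_and_witness + clerical + security_it_forensics
print(f"faculty/witness salaries: ${faculty_and_witness:,}")  # $512,000
print(f"direct cost of the case:  ${direct_total:,}")         # $524,500, i.e. ~$525,000

# Extrapolation to the 217 allegations reported to the ORI in a year.
national = 217 * 525_000
print(f"extrapolated annual cost: ${national:,}")             # $113,925,000 (> $110 million)
```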
Scientists are people and subject to the frailties of human nature, so we may never be able to totally eliminate scientific misconduct. However, we can prevent those cases of misconduct more related to “omission” of scientific standards rather than commission of misdeeds. How this can be achieved has not yet been determined. Most academic institutions have, like ours, undertaken a number of efforts to increase awareness through education and training, setting forth and enforcing scientific codes of conduct, providing mentorship training, auditing and monitoring procedures, and implementing procedures for reporting and investigating alleged incidents of misconduct. The ultimate effectiveness of these approaches may take time to discern. What is known, however, is that the costs of these proactive activities pale in comparison to the costs of a single case of scientific misconduct.
Author Contributions
ICMJE criteria for authorship read and met: AMM ADH CPW DLT. Agree with the manuscript's results and conclusions: AMM ADH CPW DLT. Designed the experiments/the study: AMM CPW DLT. Analyzed the data: AMM. Collected data/did experiments for the study: AMM CPW. Wrote the first draft of the paper: AMM. Contributed to the writing of the paper: AMM ADH CPW DLT.
- 1. Department of Health and Human Services (17 May 2005) 42CFR Parts 50 and 93, Public Health Service Policies on Research Misconduct; Final Rule. Federal Register.
Research Cases for Use by the NIH Community
Theme 23 – Authorship, Collaborations, and Mentoring (2023)
- Case 1: Transfer of a Project and Scientific Disagreement
- Case 2: Authorship or Acknowledgement of a Post-baccalaureate Trainee
- Case 3: Collaboration and Outside Activities
- Case Study Facilitator Notes
- 2023 Ethics Case Facilitator Training with Dr. David Resnik (NIEHS)
- 2023 Ethics Case Discussion Make-up Session
- Note: IC submission of ethics case completion data is due by 02/29/2024
Theme 22 – Use of Human Biospecimens and Informed Consent (2022)
- Case Study: Use of Human Biospecimens and Informed Consent
- 2022 Ethics Case Facilitator Training with Dr. David Resnik (NIEHS)
- Note: IC submission of ethics case completion data is due by 3/31/2023
Theme 21 – Science Under Pressure (2021)
- Case 1: Science Under Pressure
- Case 1 Facilitator Notes
- 2021 Ethics Case Facilitator Training with Dr. David Resnik (NIEHS)
Theme 20 – Data, Project and Lab Management, and Communication (2020)
- Case 1: Data Access, Analysis and Reporting within a Research Group
- Case 2: Postdoc Leaving NIH Lab
- Cases 1 and 2 for Facilitators
- 2020 Ethics Case Facilitator Training with Dr. David Resnik (NIEHS)
Theme 19 – Civility, Harassment and Inappropriate Conduct (2019)
- Case 1: Gender Harassment, Sexual Harassment, and Consenting Relationships
- Case 2: Freedom of Expression and Civility in the Laboratory
- Case 3: Biases in Mentoring of Fellows and Sexual Harassment
- Study Guide (2019)
Theme 18 – Implicit and Explicit Biases in the Research Setting (2018)
- Case 1: Gender Bias in Academia
- Case 2: Responsible and Equitable Mentoring of Fellows
- Case 3: Diversity and Bias – Approach to Disabilities
- Case 4: Implicit/Unconscious Biases?
Theme 17 – Socially Responsible Science (2017)
- Case 1: Deciding What Study Results to Publish and Transparency in Research Publication
- Case 2: Handling Select Agents
- Case 3: Research Competition and Reproducibility
- Case 4: Societal Aspects of the Responsible Conduct of Research
Theme 16 – Research Reproducibility (2016)
Because this important topic is both broad and provocative, with issues that could be discussed for hours, discussion leaders and participants will need to identify ways to keep the discussion on schedule. Three potential alternative approaches to this are: (a) keep discussion of the entire case concise and well-paced; (b) discuss a selected subset of the sections, labeled by Roman numerals, that are the most relevant to the particular IC and audience, and/or use only selected questions; or, (c) dedicate more than one hour to discussing this case.
- Case 1: Research Reproducibility
Theme 15 – Authorship and Collaborative Science (2015)
- Case 1: Intellectual Input, Core Facilities and Authorship
- Case 2: Authorship Disputes in Multi-Team Collaborations
- Case 3: Clinical Collaborations
- General Guidelines for Authorship Contributions
Theme 14 – Differentiating Between Honest Discourse and Research Misconduct and Introduction to Enhancing Reproducibility (2014)
- Case 1: Handling of Images and Graphs
- Case 2: A Technically Challenging Method Collides With a Hot Topic
- Case 3: Handling of Clinical Data
- Case 4: Sources of Potential Bias and Data Sharing
- Case 5: Research Reproducibility I: Sample Composition and Reproducibility
- Case 6: Research Reproducibility II: Prostate Cancer Serum Biomarker Study
Theme 13 – Data Management, Whistleblowers, and Nepotism (2013)
- Case 1: Whistleblowers
- Case 2: CLUES: Research Misconduct or Sloppy Science?
- Case 3: Data Management in Clinical Studies
- Case 4: Nepotism in the Training and Research Setting
Theme 12 – Mentoring (2012)
- Case 1: Different Mentoring Styles
- Case 2: Non-Academic Staff
- Case 3: Intellectual Property
- Case 4: Personal Relationships
Theme 11 – Authorship (2011)
- Case 1: Co-Authorship – When Changing Labs, Have You Done Enough to Be Included?
- Case 2: Criteria for Authorship and Attribution
- Case 3: Multiple Publications
- Case 4: First Authorship, Publicity, and Multiple Institutions
- NIH IRP Authorship Conflict Resolution Process
Theme 10 – Science and Social Responsibility, Continued (2010)
- Case 1: Potential Consequences of Epidemiological Studies
- Case 2: Scientific Research and the Press
- Case 3: Intellectual Property – Why Use an MTA
Theme 9 – Science and Social Responsibility – Dual Use Research (2009)
- Case 1: Streptococcus pneumoniae Membrane Pump Sequence
- Case 2: Pandemic Influenza Genomic Sequence
- Case 3: An Unusual Wrinkle to Translational Research
- Case 4: Cell-matrix Interaction and Tumor Growth & Metastasis
Theme 8 – Borrowing – Is It Plagiarism? (2008)
- Case 1: Borrowing Results
- Case 2: Borrowing Ideas
- Case 3: Borrowing English
Theme 7 – Data Management and Scientific Misconduct (2007)
- Case 1: Data Management of Computer-generated Files
- Case 2: Handling of Images and Graphs
- Case 3: Appropriate Use of Statistics
- Case 4: Appropriate Sources of Data and Decision to Publish
- Case 5: Handling of Clinical Data
Theme 6 – Ethical Ambiguities (2006)
- Ethical Ambiguities
Theme 5 – Data Management (2005)
- Figure 1 (Image File)
- Epidemiological and Clinical Data Management
- What’s in a Picture? The Temptation of Image Manipulation
- Three Retractions Published in Cell (2004)
Theme 4 – Collaborative Science (2004)
- Case 1: Basic-Clinical Collaboration
- Case 2: When Does a Collaborator Deserve Authorship
- Case 3: Equipment Sharing and Authorship
- Case 4: Assays and Authorship
- Case 5: Collaboration and Credit
- Case 6: The Statute of Limitations
Theme 3 – Mentoring (2003)
- Case 1: Different Supervising and Mentoring Styles
- Case 2: Equal Treatment of Postdoctoral Fellows – Sharing of Job Ads
- Case 3: Work Hours and Schedules
- Case 4: Mentoring of Technicians
- Case 5: Intellectual Property – Mentor’s Use of Fellow’s Research Proposal
- Case 6: Future Collaborations for Tenure-track Investigators
- Case 7: Issues Related to CRADA Funding of Trainees
- Case 8: Confirmation of Lab Results – Request for Secrecy
- Case 9: Handling of Personal Relationships in the Laboratory
- Case 10: Dealing with A Substance Abuse Problem in the Laboratory
- A Guide to Training and Mentoring in the Intramural Research Program at NIH
- SD Policy Updates for Mentors and Trainees (May 2002)
Theme 2 – Authorship (2002)
- Case 1: Authorship and the Role of the Absent Researcher
- Case 2: To Be or Not To Be Included
- Case 3: Student Publishes
- Case 4: Criteria for Authorship and Attribution
- http://www.stanford.edu/dept/DoR/rph/2-8.html
- http://www.responsibility.research.umich.edu/casematerialsdir.html
- http://www.apa.org/journals/amp/kurdek.html
- http://www.councilscienceeditors.org/services_ATF.shtml
- http://www.ed.gov/databases/ERIC_Digests/ed410318.html
Theme 1 – Scientific Misconduct (2001)
- Case 1: Scientific Misconduct in the Laboratory
- Case 2: Suspicions of Misconduct in Clinical Research
- Case 3: Ownership of Ideas
- A Guide to the Handling of Scientific Misconduct Allegations in the Intramural Research Program at the NIH
- Intramural Research Program Policies & Procedures for Research Misconduct Proceedings
This page was last updated on Friday, January 26, 2024
Topics: Research Misconduct
A guide providing information and resources for teaching the responsible conduct of research, focused on the topic of research misconduct. Part of the Resources for Research Ethics Education collection.
Research misconduct is defined as "fabrication, falsification, or plagiarism in proposing, performing, or reviewing research, or in reporting research results" (Code of Federal Regulations: 42 CFR Part 93).
Complexity of Research Misconduct
Research misconduct is complex (DuBois et al., 2013):
- Specifics of misconduct as well as perceptions can vary greatly from case to case. The heterogeneous nature of research misconduct makes it difficult to capture the full essence of the act with a simple explanation.
- Motivations for misconduct vary widely and may include personality traits, stress, feelings of unfairness, and any of many other reasons.
Research Misconduct Prevention
Self-policing with Quality Research Practices
Good science practices minimize the risk of misconduct. For example:
- Strict adherence to the scientific method
- Clear, detailed recordkeeping
- Meaningful and clear delineation of collaboration
- Shared understanding of authorship roles and responsibilities
- Attentive mentoring for newer members of the research environment
- Encouragement and support for asking questions and open discussion
Responding to Research Misconduct
Obligations to Act
- Scientists do not all agree on whether, when, and how to report misconduct; this disagreement is even greater between scientists and administrators (Wenger et al., 1999).
- An allegation of research misconduct is one of the most serious charges that can be made against a scientist. Therefore, it is essential that a charge be sustained only if justified by documentation and other relevant evidence.
- Whether one is making the allegation or being accused of misconduct, clear documentation provides the best chance for a fair and timely resolution.
Questionable Conduct
- Some aspects of conduct are too new or poorly defined to allow for a simple answer about what is appropriate. Other behaviors may stem simply from bad manners, honest errors, or differences of opinion, which may be questionable without being research misconduct.
- Impressions should be validated before making serious charges, and many apparent problems can be resolved by other means.
Dispute resolution
Many concerns are best addressed by means other than alleging research misconduct. Some institutions have formal mechanisms in place for conflict resolution, mediation, or arbitration; absent such mechanisms, finding a solution to a dispute may require some creativity.
- Conflict resolution: Often, good conflict resolution skills may be helpful or even sufficient. Deal with the problem as early as possible. Begin by defining points of agreement and then work on areas of disagreement. Emphasize the problem rather than the person. Give and ask for clear communication about what is most important to each of the interested parties.
- Mediation: A respected third party can sometimes help with mediating a dispute. The goal is to clarify issues in a way that permits the best possible agreement or compromise.
- Arbitration: When other avenues of communication have failed, then parties to a dispute might be convinced to put their cases before a mutually agreeable arbitrator for review and a binding decision.
Public Allegations
- The pace of the process for dealing with alleged misconduct can be frustrating. In such circumstances, it can be tempting to discuss the case publicly. However, placing a complex, unresolved issue into the public arena can be harmful to those directly involved and the scientific community as a whole.
- Publicity may also compromise the integrity of an ongoing inquiry and the privacy of parties to the investigation. Moreover, an attempt to circumvent the institutional process may prejudice those charged with reviewing the allegation.
Science is predicated on trust
Without confidence in the integrity of their peers, scientists would lack a foundation on which to build new work.
Self-regulation
Self-regulation and self-policing operate to ensure the legitimacy of research, and necessitate that scientists foster an environment in which responsible research is explicitly discussed and encouraged. In part, this means that scientists should be familiar with definitions of research misconduct and procedures for dealing with it, regardless of whether they will ever be party to allegations.
How frequently does research misconduct occur?
There are some indications that research misconduct occurs only rarely, although questionable research practices may be common (e.g., Kalichman and Friedman, 1992; Martinson et al., 2006). However, there are many barriers to accurately quantifying the extent of research misconduct; for example, cases may go unreported and institutions may be biased against finding misconduct. The actual rate of research misconduct could be as low as 1 in 100,000 or as high as 1 in 100 (Steneck, 2000; Steneck, 2006). Yet, in the past 25 years, many serious allegations of misconduct have been widely publicized, and some of those were borne out by subsequent investigation.
Examples of Research Misconduct
Hwang Woo-suk’s Stem Cell Research (Sang-Hun, 2009)
In 2006, Korean researcher Hwang Woo-suk was found to have fabricated a series of experiments in stem cell research. In two articles published in the journal Science, he had reported creating embryonic stem cells through cloning. In addition to research misconduct, Hwang was charged with embezzlement and bioethics violations.
Bengü Sezen’s Research Misconduct (Marcus, 2010)
Bengü Sezen, a chemistry researcher at Columbia University, is responsible for one of the worst cases of research misconduct in the chemistry community. Sezen perpetrated a massive, sustained effort to manipulate and falsify research data, even going to the extent of creating fictitious people and organizations to back up her data. The Office of Research Integrity found Sezen guilty of 21 counts of research misconduct.
Regulations and Guidelines
Federal definition of research misconduct.
A government-wide definition of Research Misconduct was proposed by the Office of Science and Technology Policy (OSTP, 2000) and is now covered in the Code of Federal Regulations for the Public Health Service (PHS, 2005), the National Science Foundation (NSF, 2005), and other agencies as well. In all cases, research misconduct is essentially defined as: "fabrication, falsification, or plagiarism in proposing, performing, or reviewing research, or in reporting research results."
- Fabrication is making up data or results and recording or reporting them.
- Falsification is manipulating research material, equipment, or processes, or changing or omitting data or results such that the research is not accurately represented in the research record.
- Plagiarism is the appropriation of another person’s ideas, processes, results, or words without giving appropriate credit.
Minimally, for something to count as research misconduct, it must be committed intentionally, knowingly, or recklessly, and there must be a significant departure from accepted practices of the relevant research community. Not all instances of misbehavior or questionable conduct are covered under these policies, but for those practices that are covered, there are explicit steps that must be taken in the event of an allegation of misconduct.
Responsibilities
Shared responsibilities for addressing research misconduct
- Federal agencies have ultimate oversight authority for Federally-funded research
- Research institutions bear primary responsibility for the prevention and detection of research misconduct and for the phases required once research misconduct has been reported.
Phases of Response to Allegation of Research Misconduct
- Inquiry: assessment of whether the allegation has substance and if an investigation is warranted
- Investigation: formal development of a factual record, and examination of that record leading to the dismissal of the case or to a recommendation for a finding of research misconduct or other appropriate remedies
- Adjudication: recommendations are reviewed and appropriate corrective actions determined
Discussion Questions
- Define fabrication, falsification, and plagiarism.
- Give at least three examples of misconduct by researchers that would not meet the existing definitions of research misconduct. In your institution, what can be done about these types of misconduct?
- In your institution, what formal procedures or mechanisms (e.g., ombudsman, conflict resolution, arbitration, mediation) are available to help resolve disputes or questions about the responsible practice of science?
- Outline the basic steps to be followed in your institution for responding to an allegation of research misconduct.
- If you have direct evidence that someone in your institution has committed research misconduct, then to whom and how should such an allegation be made?
- If you were accused of having fabricated data that you had produced, how could you demonstrate that you really did obtain the results you reported?
Case Study 1
A graduate student, working on a project that involves extensive DNA sequencing, provides his mentor with a computer-generated sequence of a gene. The student tells his mentor that the sequence determination has involved complete analysis of both strands of the DNA molecule. Over the next several months, it is determined that not all of the sequence data reflects analysis of both DNA strands. Indeed, follow-up work by a postdoctoral fellow in the laboratory reveals several mistakes in the sequence. The student in question admits to misleading his mentor and, following appropriate investigation, is found to have committed scientific misconduct and dismissed from the graduate program. The mentor realizes that the student presented some of the erroneous data at a regional scientific meeting. Proceedings of the meeting were not published but abstracts of all of the works presented were distributed to approximately 100 meeting participants. In addition, the student, with the mentor's permission, sent the sequence by electronic mail to three other laboratories. What, if any, responsibility does the faculty mentor have with regard to disclosing the above developments? What, if anything, should the mentor do about the prematurely released data? Under these circumstances, what is the potential for harm coming from this incident of scientific fraud? Who might be harmed?
Case Study 2
You are an editor for the Journal of Novel Diagnostics. You recently handled a manuscript that compared two new diagnostic tests for the detection of a genetic defect. Test 1 is marketed by Genetix, Inc., and test 2 is marketed by Probes Unlimited. The manuscript concludes that test 1 is superior in terms of reliability and accuracy. Following peer review and minor revision, you accept the paper and it appears in print. Shortly after publication, you receive a letter from the Vice President for Research at Probes Unlimited. She claims that examination of the methods section of the paper reveals that the authors used test 2 in a manner that significantly deviates from the instructions provided by Probes Unlimited. Moreover, she claims that the senior author on the paper has previously received research grants from Genetix, Inc. Is this "sloppy science" or scientific fraud? What course of action do you take?
Case Study 3
Dr. Hickory submits a grant application to a federal funding agency. When he receives the summary statement review of the grant application, he finds that it has been criticized on several grounds and that it has received a score that will prevent the application from being funded. He decides to do more experiments to generate preliminary information and indefinitely postpones resubmitting the grant application. Approximately 18 months later, Dr. Hickory is asked to serve as an ad hoc reviewer for a research grant submitted to a private foundation. The topical area of the grant is closely aligned with Dr. Hickory's area of expertise. It turns out that the principal investigator of this application, Dr. Poplar, was a member of the panel that previously reviewed Hickory's above-referenced grant. In reading the introductory section of the grant application, Dr. Hickory realizes that the structure and content of this section is strikingly similar to his previously submitted unfunded grant application. In fact, there are several areas of the introduction where wording is virtually identical to his initial grant application. Moreover, several of the experiments proposed in the application to the private foundation are quite similar (but not identical) to the ones he had previously proposed. Dr. Hickory wonders what he can and should do about this situation. He comes to you for advice. What advice do you give him?
OEC Falsification, Fabrication, Plagiarism & Cheating Bibliography A bibliography of websites, articles, guidelines, and books looking at different aspects of research misconduct.
Cited Resources
- DuBois JM, Anderson EE, Chibnall J, Carroll K, Gibb T, Ogbuka C, Rubbelke T (2013): Understanding Research Misconduct: A Comparative Analysis of 120 Cases of Professional Wrongdoing. Accountability in Research 20:320–338. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3805450
- Kalichman MW, Friedman PJ (1992): A pilot study of biomedical trainees' perceptions concerning research ethics. Academic Medicine 67:769-775.
- Marcus A (2010): ORI comes down (hard) on Bengu Sezen, Columbia chemist accused of fraud. Retraction Watch. http://retractionwatch.com/2010/12/01/ori-comes-down-hard-on-bengu-sezen-columbia-chemist-accused-of-fraud
- Martinson BC, Anderson MS, Crain AL, de Vries R (2006): Scientists' Perceptions of Organizational Justice and Self-Reported Misbehaviors. Journal of Empirical Research on Human Research Ethics 1:51-66
- NSF (2005): Sec. 689.1 Definitions. Part 689-- Research Misconduct. Subpart A—General. Chapter VI--National Science Foundation. Title 45--Public Welfare. 45CFR689.1(a). http://www.nsf.gov/oig/resmisreg.pdf
- OSTP (2000): Federal Policy on Research Misconduct: Notification of Final Policy. Federal Register December 6, 2000 65(235):76260-76264. http://ori.hhs.gov/policies/fed_research_misconduct.shtml
- PHS (2005): Sec. 93.103 Research misconduct. Part 93-- Public Health Service Policies on Research Misconduct. Subpart A—General. Chapter I--Public Health Service, Department of Health and Human Services. Title 42--Public Health. 42CFR93.103. http://www.access.gpo.gov/nara/cfr/waisidx_05/42cfr93_05.html
- Sang-Hun C (2009): Disgraced cloning expert convicted in South Korea. Asia Pacific, New York Times. http://www.nytimes.com/2009/10/27/world/asia/27clone.html
- Steneck N (2000): Assessing the integrity of publicly funded research: A background report for the November 2000 ORI Research Conference on Research Integrity. http://ori.hhs.gov/documents/proceedings_rri.pdf
- Steneck N (2006): Fostering Integrity in Research: Definitions, Current Knowledge, and Future Directions. Science and Engineering Ethics 12:53-74.
- Wenger NS, Korenman SG, Berk R, Honghu L (1999): Reporting unethical research behavior. Evaluation Review 23:553-570.
The Resources for Research Ethics Education site was originally developed and maintained by Dr. Michael Kalichman, Director of the Research Ethics Program at the University of California San Diego. The site was transferred to the Online Ethics Center in 2021 with the permission of the author.
This material is based upon work supported by the National Science Foundation under Award No. 2055332. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
Scientific misconduct and science ethics: a case study based approach
Affiliation.
- 1 Institute for Science, Innovation and Society (ISIS), Faculty of Science, Radboud University Nijmegen, Toernooiveld 1, 6525 ED Nijmegen, The Netherlands. [email protected]
- PMID: 16909155
- DOI: 10.1007/s11948-006-0051-6
The Schön misconduct case has been widely publicized in the media and has sparked intense discussions within and outside the scientific community about general issues of science ethics. This paper analyses the Report of the official Committee charged with the investigation in order to show that what at first seems to be a quite uncontroversial case turns out to be an accumulation of many interesting and non-trivial questions (of both ethical and philosophical interest). In particular, the paper intends to show that daily scientific practices are structurally permeated by chronic problems; this has serious consequences for how practicing scientists assess their work in general, and scientific misconduct in particular. A philosophical approach is proposed that sees scientific method and scientific ethics as inextricably interwoven. Furthermore, the paper intends to show that the definition of co-authorship that the members of the Committee use, although perhaps clear in theory, proves highly problematic in practice and raises more questions than it answers. A final plea is made for a more self-reflecting attitude of scientists as far as the moral and methodological profile of science is concerned as a key element for improving not only their scientific achievements, but also their assessment of problematic cases.
Related Papers
Daryl Chubin
Olga Savvina
The paper analyses the types of scientific misconduct and tries to evaluate the prevalence of these practices, based on statistics and studies in social science. It is concluded that significant scientific misconduct like fabrication, falsification and text plagiarism are not spread and occur quite rare. At the same time the real problem of scientific ethics is "grey area" or prevalence of "grey methods" in science like over interpretation of results, selective reporting, study weaknesses are not described, carelessness and incompetence and others. Despite it we have no moral rights to blame scientists using "grey methods". The author formulates the principles of the modern scientists which contain rejection of practicing fabrication, falsification and plagiarism; implementation the limitations of scientific activity and attempts to avoid "grey methods" in science. The paper also emphasizes the significance of collaboration of honest scientists.
Marijke Van Buggenhout
This deliverable DIII.1.2 is part of work package 3 in which indicators are gathered on the extent of misconduct and how institutions respond to breaches of scientific integrity. As a part of the empirical phase in PRINTEGER it contributes to our analysis of what policies and organizational responses are most likely to engender a culture of integrity in research organisations. The exploration of the incidence of misconduct is combined with the institutional response, since it is partly through this response that misconduct is made explicit or even defined. This deliverable reflects on one of the key questions in the scientific integrity debate; what is the incidence or extent of misconduct in science? This is one of the questions raised many years ago, but a clear-cut answer is not available and may be even impossible to formulate. What we know about misconduct in science has for the largest part been derived from self-report studies and rough estimations in statistics of universities, control agencies or funding bodies. Therefore, it remains difficult to conclude whether or not these estimations are correct, significant and reliable. In this deliverable, we report about our attempt to gather empirical data on breaches of integrity that have ended up in official administrative or institutional (academic) files e.g. cases which are visible in administrative procedures of research and research funding institutions or bodies for investigating misconduct cases. With this report, we do not pretend to have found a clear answer to the incidence question. We do however aim to make visible the procedural chain that is followed when a case of misconduct comes to the surface. Besides a ‘mapping’ exercise, we aim at discussing theoretical and methodological issues when it comes to gathering data relying upon official procedures. Registration practices differ greatly from one research institution to another, from one country to another. 
This makes comparative research in general (and between the countries involved in this deliverable) very difficult if not impossible. However, we argue that issues concerning denunciation, discovery and registration practices, whistle blowing, transparency, gaining access, confidentiality, reputational bias etc. are precisely worth a close scrutiny and must be discussed when doing research on the prevalence of misconduct in science in a European context. Indeed, in our view these aspects of the incidence question are not merely technical (or methodological) but they reveal a lot about the nature of scientific misconduct and about how scientific integrity and misconduct are intimately intertwined with daily scientific and academic practices and organization. They are mutually constitutive. Hence, there may be significant differences between disciplinary scientific practices as well as between national science systems (countries). It reveals a lot about how alleged breaches of integrity and misconduct are experienced, detected, reported, processed, registered and reacted upon. Starting from a state of the art of what has been measured in previous research, we focused on the biases that have to be taken into account when measuring the extent and incidence of misconduct. Besides a discussion on issues related to the use of official statistics, self-report studies and reputational biases we reflect on the conceptual issues embedded in the process of registration and their consequences for registering practices in administrative procedures. In a next step, we discuss the separate methodologies of the partners involved in this deliverable and the results that were obtained. This report wraps up with a concluding part, reflecting on future directions for research on this topic and the data sources that are useful to measure misconduct in science.
Journal of Bioethical Inquiry
Martin Bridgstock
ISBE Newsletter
Bob Montgomerie
Frederick Green
Science, in particular physics, is a collective enterprise; a fruit of the exquisitely social nature of human living. So it is inevitable to encounter ethical issues in natural science, since the contest of differing interests and views is perennial in its practice, indeed essential to its momentum. The crucial ethical question always hangs in the air: How is the truth best served? This is a very limited imperative for science to follow, excluding as it does most questions of meaning and valuation. For example, in science one does not normally ask: Why is the truth to be served? As one type of ethical “bound” in science, these forgone questions are properly analysed within moral philosophy. A more pragmatic bound is the degree to which ethics can persist as a reliable guide in a milieu wherein we all fall short at some time, and where the pressures of individual professional survival have become intense. In this paper we describe some ethical aspects of our own discipline of science: their cultural context and the bounds which they delineate for themselves, sometimes in transgression. We argue that the minimalist ethic espoused in science, namely loyalty to truth, is a bellwether for the much wider, more problematic, and more vital consequences of ethics – and its failure – in human relationships at large.
This deliverable is part of Work Package II of the Promoting Integrity as an Integral Dimension of Excellence in Research (PRINTEGER) research project. Titled What is integrity? Multidisciplinary Reconnaissance, Work Package II is devoted to the analytic reconnaissance of research integrity and scientific misconduct. This report contributes to this reconnaissance by conceptualizing deviance in science from a criminological perspective. In this chapter we aim at discussing deviance in science or scientific misconduct from a criminological perspective. A criminological approach focusses on the complexity of deviant behavior, as well as on the problematization of this behavior as deviance, and how this is part of the social reaction to it. In order to understand deviance in science we need to deconstruct the several dimensions that shape this paradoxical object (Pires, 1993). As criminologists we cannot look at misconduct in science as if it was a naturally given or ontological entity. On the contrary, we need to take into consideration the social processes that problematize scientific practices (behaviors) as not acceptable or deviant: it is precisely through these social processes not only that the figure of “scientific deviance or misconduct” is constructed, but also and at the same time, that practices of social reaction and control, and possibilities for (early-) intervention, emerge. Therefore, we will address both the phenomenon of so-called misconduct as well as the social reaction it calls into being. Scientific misconduct indeed refers to its classical forms, well known as FFP, meaning Fabrication, Falsification and Plagiarism (FFP). But as we will see in this contribution, scientific misconduct refers to a much broader category of researchers’ behavior when doing science.
Scientific misconduct and science ethics: a case study based approach
Luca Consoli PhD
Science and Engineering Ethics, Volume 12, pages 533–541 (September 2006)
The Schön misconduct case has been widely publicized in the media and has sparked intense discussions within and outside the scientific community about general issues of science ethics. This paper analyses the Report of the official Committee charged with the investigation in order to show that what at first seems to be a quite uncontroversial case turns out to be an accumulation of many interesting and non-trivial questions (of both ethical and philosophical interest). In particular, the paper intends to show that daily scientific practices are structurally permeated by chronic problems; this has serious consequences for how practicing scientists assess their work in general, and scientific misconduct in particular. A philosophical approach is proposed that sees scientific method and scientific ethics as inextricably interwoven. Furthermore, the paper intends to show that the definition of co-authorship that the members of the Committee use, although perhaps clear in theory, proves highly problematic in practice and raises more questions than it answers. A final plea is made for a more self-reflective attitude among scientists toward the moral and methodological profile of science, as a key element for improving not only their scientific achievements, but also their assessment of problematic cases.
References
1. Literature on this topic is vast. For a case study of how media influence the public perception of science, see Gregory, J. (2003) The popularization and excommunication of Fred Hoyle's 'life-from-space' theory. Public Understanding of Science 12: 25–46. For a sociological study of the impact of technology on society and the public, an interesting perspective is offered by Ellul, J. (1967) The Technological Society. USA: Random House. See also Boulter, D. (1999) Public perception of science and associated general issues for the scientist. Phytochemistry 50: 1–7; Cohen, B.L. (1998) Public perception versus results of scientific risk analysis. Reliability Engineering and System Safety 59: 101–105.
2. Lafollette, M. (1992) Stealing into Print: Fraud, Plagiarism, and Misconduct in Scientific Publishing. Berkeley: University of California Press.
3. Drenth, P.J.D. (1999) Scientists at fault: causes and consequences of misconduct in science, in: European Science and Scientists between Freedom and Responsibility. Luxembourg: Office for Official Publications of the European Community.
4. Beasley, M. et al. (2002) Report of the Investigation Committee on the Possibility of Scientific Misconduct in the Work of Hendrik Schön and Coauthors. Lucent Technologies. Available online at URL: http://www.lucent.com/news_events/researchreview.html. We will refer for convenience to this document from now on as "Report".
5. Service, R.F. (2002) Winning streak brought awe, and then doubt. Science 297: 34–37.
6. Goss Levi, B. (2002) Bell Labs convenes committee to investigate questions of scientific misconduct. Physics Today.
7. Brumfield, G. (2002) Misconduct finding at Bell Labs shakes physics community. Nature 419: 419–421.
8. Chang, K. (2002, September 26) Panel says Bell Labs scientist faked discoveries. New York Times.
9. Kolata, G. (2002, September 29) Assigning blame if fraud is found. New York Times.
10. Nature 429: 692 (17 June 2004); 429: 789 (24 June 2004).
11. Electronic document, available at URL: http://www.ostp.gov/html/001207_3.html.
12. Merriam-Webster Online Dictionary, URL: http://m-w.com.
13. Kuhn, T.S. (1970) The Structure of Scientific Revolutions. University of Chicago Press, USA.
14. Recommendations of the Commission on Professional Self Regulation in Science, Deutsche Forschungsgemeinschaft. Available at URL: http://www.dfg.de.
Author information
Luca Consoli PhD (Assistant Professor), Science and Society, Department of Philosophy, Institute for Science, Innovation and Society (ISIS), Faculty of Science, Radboud University Nijmegen, Toernooiveld 1, 6525 ED, Nijmegen, The Netherlands
About this article
Consoli, L. Scientific misconduct and science ethics: a case study based approach. Science and Engineering Ethics 12, 533–541 (2006). https://doi.org/10.1007/s11948-006-0051-6
Received: 15 March 2004 · Revised: 19 January 2006 · Accepted: 11 May 2006 · Issue date: September 2006
Keywords: science ethics, methodology
In this case, Visvanathan committed the plagiarism and Lushington knowingly refrained from reporting it to the university. Learn more about this case here. Columbia University Research Misconduct. The year was 2010. Bengü Sezen was finally caught falsifying data after ten years of continuously committing fraud.
The following five detailed case histories of specific cases of actual and alleged research misconduct are included in an appendix to raise key issues and impart lessons that underlie the committee's findings and recommendations without breaking up the flow of the report. In several cases, including the translational omics case at Duke University and the Goodwin case at the University of ...
This page contains cases in which administrative actions were imposed due to findings of research misconduct. The list includes only those who currently have administrative actions imposed against them. It does not include the names of individuals whose administrative action periods have expired.
Scientific misconduct is the violation of the standard codes of scholarly conduct and ethical behavior in the publication of professional ... The case-study upon which Steinschneider's theory was based was later revealed to involve infanticide committed by the mother, with Steinschneider allegedly having ignored evidence and reports that ...
Rooting out scientific misconduct. Ivan Oransky and Barbara Redman. Science, 11 Jan 2024, Vol 383, Issue 6679, p. 131. DOI: 10.1126/science.adn9352. Scientific misconduct is an issue rife with controversy, from its forms and definitions to the policies that guide how allegations are handled.
The Dana-Farber Cancer Institute, an affiliate of Harvard Medical School, is seeking to retract six scientific studies and correct 31 others that were published by the institute's top ...
Scientific misconduct is a growing phenomenon defined as "fabrication, falsification or plagiarism in proposing, performing or reviewing research or in reporting research" according to the US Office of Research Integrity. ...
It is possible to estimate the cost (direct and indirect) of investigating a single case of scientific misconduct. For a specific investigation for which costs were estimated for all phases of the review process, direct cost estimates approached US$525,000. For an individual country, the total costs associated with the review of all cases of ...
A case study of the work of Dr. William Fals-Stewart, a prolific and widely-respected researcher of Behavioral Couples Therapy (BCT), who faced serious allegations of research misconduct prior to his death, including data falsification, will be presented to illustrate the insidious consequences of fraudulent data on an entire field of research.
Abstract. In today's world, evil appears to be all pervading. Medical publication is no exception. Scientific misconduct in medical writing is slowly becoming a global concern, especially over the last few decades. While the occurrence of such events is certainly rare, every researcher and reader should be aware of this entity.
Case 1: Handling of Images and Graphs. Case 2: A Technically Challenging Method Collides With a Hot Topic. Case 3: Handling of Clinical Data. Case 4: Sources of Potential Bias and Data Sharing. Case 5: Research Reproducibility I: Sample Composition and Reproducibility. Case 6: Research Reproducibility II: Prostate Cancer Serum Biomarker Study.
In this paper, we explore different possible explanations for research misconduct (especially falsification and fabrication), and investigate whether they are compatible. We suggest that to explain research misconduct, we should pay attention to three factors: (1) the beliefs and desires of the misconductor, (2) contextual affordances, (3) and unconscious biases or influences. We draw on the ...
Case Study 1. A graduate student, working on a project that involves extensive DNA sequencing, provides his mentor with a computer-generated sequence of a gene. ... The student in question admits to misleading his mentor and, following appropriate investigation, is convicted of scientific misconduct and dismissed from the graduate program. The ...
Cases of scientific misconduct undermine the credibility of published results and ultimately reduce the confidence in the value of scientific research as a whole (Fang, Steen, & Casadevall, 2012).The detection of some spectacular cases of scientific misconduct (e.g., the case of Diederik Stapel; Callaway, 2011) has contributed to concerns over the validity of published results in psychology ...
For research to proceed efficiently, two aspects of scientific integrity need to be fostered. Firstly, there is the integrity of the scientific literature, which can accumulate errors due to inadvertent mistakes as well as due to deliberate falsification or fabrication of data, i.e., research misconduct. Secondly, there is the integrity of the ...
Scientific misconduct and fraud are prevailing problems in science and it threatens to undermine integrity, credibility, and objectivity in genuine research. ... One such case is that of Dutch social psychologist Diederik Stapel (1990), who fabricated more than 50 influential studies, usually "finding" things that academic liberals wanted ...
Fraud and misconduct are the two terminologies often used interchangeably. However, there is a gross distinction between the two. Scientific misconduct/fraud is a violation of the standard codes of scholarly conduct and ethical behavior in scientific research. Definition of fraud as defined in court is "the knowing breach of the standard of ...
By opening up discussion on scientific misconduct in India through various documented cases, which helps combat the hagiographical account of scientific practices in India, the SSV became the country's 'court of last resort' in matters involving misconduct. The first case it took up was against C. N. R. Rao (FRS) for the "use of ...
The details of the research are not in this context directly relevant, but very good accounts of it can be found (see for example 5). After five years of successful research and publications, something happened ...
Thomas Südhof says mistakes in co-authored publications are honest errors that don't affect studies' conclusions. ... a neurologist at Vanderbilt University Medical Center who has investigated several cases of scientific misconduct, says Südhof has been unusually open in discussing the errors. ... An integrity sleuth who goes by the ...