The importance of randomized clinical trials is acknowledged by most physicians, professional
organizations, and federal agencies. Clinical trials influence practice and are at the pinnacle of
the evidence pyramid, either alone or as part of meta-analyses. However, many threats to the
validity and scientific integrity of clinical trials can occur in their design, implementation,
analysis, and reporting. Examples include modifications in primary and secondary outcomes or
analytic approaches after the trial begins or after initial examination of data; manipulation and
falsification of data; incomplete reporting of all available trial outcomes and selective reporting
of important serious adverse events; and failure to report the results of trials with negative or null findings.
Current best practice to ensure scientific validity in the design and conduct of clinical trials
includes prespecification of the trial protocol, an associated statistical analytic plan, careful
documentation of any changes in the protocol, and oversight by institutional review boards (IRBs)
and data and safety monitoring committees. Best practice for reporting of clinical trials includes
emphasis on the primary outcomes; distinguishing between secondary outcomes and post hoc analyses;
careful attention to statistical issues, such as maintaining fidelity to the original statistical
analysis plan and appropriate treatment of missing data; complete reporting of adverse events; and
balanced, objective interpretation of the study findings.
Many changes have occurred in the reporting of clinical trials over the last decade and have
served to address many of these threats to validity. Trial registration, mandated since 2005 by the
International Committee of Medical Journal Editors,1 has
reduced the likelihood of suppressed trials and has helped improve the fidelity of clinical trial
analysis and reporting, such as reducing selective or altered reporting of major outcomes. The
CONSORT statement was released in 1996,2 revised in
20013 and 2010,4
with extensions that focus on noninferiority trials in 20065
and 2012,6 cluster randomized trials (2012),7 and patient-reported outcomes (2013),8 and has been widely adopted by most medical journals as a mechanism to improve
and standardize reporting of clinical trials.
In addition, more widespread availability and detailed review of trial protocols and statistical
analytic plans have helped clarify the intention of investigators at the time studies are designed
and, along with gradual increases in data sharing, have facilitated the ability of
reviewers and journal editors to assess the scientific validity of studies. Following the approval
of new drugs, the US Food and Drug Administration (FDA), which requires submission of complete data
from trials of products under consideration for approval, is making trial data more readily and
easily accessible (Erica Jefferson, deputy director of the FDA office of public affairs, written
communication, May 16, 2013). Likewise, the European Medicines Agency announced plans to provide
access to all clinical trial data sets submitted by industry in applications for new product
registration, and some pharmaceutical companies have announced plans to make data available to
appropriate groups upon request. Collectively these changes have led to a substantial improvement in
reporting, assessment, and transparency of clinical trials.
In 2005 JAMA adopted and modified a number of policies regarding conflicts of
interest, financial aspects of research, and the role of sponsors in funded research.9 These policies—which included a requirement for
independent statistical analysis by an academic biostatistician for industry-sponsored and
industry-analyzed studies—were developed during a time when several high-profile trials had
evidence of problems with data integrity, inappropriately conducted statistical analyses, and
incomplete reporting of major findings. Over time some of these policies have been modified and
strengthened10 but have been perceived by some in academia
and industry as creating barriers to publication of important trial results. Moreover, over the past
2 years, our experience has been that the conduct of additional analyses by independent academic
biostatisticians generally did not result in meaningful changes in the study results.
Accordingly, we are once again modifying one of the policies. JAMA will evaluate
and consider for publication clinical trials that are analyzed by statisticians employed by or
contracted by the study sponsor, without requiring independent statistical analysis by an academic
biostatistician. Advances over the past decade in standards of clinical trial reporting, enhanced
understanding of the threats to validity of clinical research, increasing data transparency, and our
experience support the change in policy.
We will continue to require submission of a copy of the trial protocol and statistical analytic
plan with all amendments, and we will require trials to be appropriately registered in an approved
publicly accessible database. Any changes to the protocol, analysis plan, or trial registration that
occurred after the trial began need to be explained in detail and justified with appropriate
documentation. JAMA editorial review will continue to include close examination of
these documents for discrepancies with the submitted manuscript and diligent evaluation and scrutiny
of all scientific reports.11 If concerns are raised in the
process of editorial evaluation, we reserve the right to request the entire data set from authors to
conduct our own statistical analysis. As stated in an editorial earlier this year,12 we prefer that investigators at academic institutions rather
than employees of the study sponsor be responsible for preparing the manuscripts and analyzing data
from clinical trials. Indeed, that is the practice at many leading research institutions. As with
all manuscripts, the first priority in decisions about publication will always be the integrity of the science.
Many important changes in medical journalism have occurred over the past decade. The presentation
of information has moved from print to web to mobile, with audio and video media. The open access
movement continues to expand. Alternative metrics to measure the impact of articles and journals and
other new initiatives—such as COUNTER, AllTrials, FundRef, ORCID, data crawlers, and
plagiarism detection software—and increasing concerns about both intellectual and financial
conflict of interest and bias are other important issues that journal editors must consider.
JAMA will continue to discuss and debate these new ideas and communicate with our
readers about changes we make in major editorial policies. Journals must evolve as the science that
is the heart and soul of journals changes.
Published Online: June 20, 2013. doi:10.1001/jama.2013.8083.
Additional Contributions: I am grateful for the many conversations I have had with
members of the editorial board, associate editors, and senior and deputy editors about our approach
to the review of randomized clinical trials and for the specific comments from Harold C. Sox, MD;
Roger J. Lewis, MD, PhD; and C. David Naylor, MD, DPhil, on earlier drafts of this editorial.
Editorials represent the opinions of the authors and JAMA and not those of the American Medical Association.