Objective To examine the degree of concordance in reporting serious adverse events (SAEs) from antidepressant and antipsychotic drug trials among journal articles and clinical trial summaries, and to categorise types of discrepancies.
Design Cross-sectional study of summaries of all antidepressant and antipsychotic trials included in an online trial registry and their first associated stand-alone journal articles.
Setting Clinicalstudyresults.org, sponsored by Pharmaceutical Research and Manufacturers of America; clinicaltrials.gov, administered by the US National Institutes of Health.
Main outcome measure Three coders extracted data on the numbers and types of SAEs.
Results 244 trial summaries for six antidepressant and antipsychotic drugs were retrieved, 142 (58.2%) listing an associated article. Of 1608 SAEs in drug-treated participants according to trial summaries, 694 (43.2%) did not appear in associated articles. Nearly 60% of SAEs counted in articles and 41% in trial summaries had no description. Most cases of death (62.3%) and suicide (53.3%) were not reported in articles. Half or more of the 142 pairs were discordant in reporting the number (49.3%) or description (67.6%) of SAEs. These discrepancies resulted from journal articles' (1) omission of complete SAE data, (2) reporting of acute phase study results only and (3) more restrictive reporting criteria. Trial summaries with zero SAEs were 2.35 (95% CI, 1.58 to 3.49; p<0.001) times more likely to be published with no discrepancy in their associated journal article. Since clinicalstudyresults.org was removed from the Internet in 2011, only 7.8% of retrieved trial summaries appear with results on clinicaltrials.gov.
Conclusions Substantial discrepancies exist in SAE data found in journal articles and registered summaries of antidepressant and antipsychotic drug trials. Two main scientific sources accessible to clinicians and researchers are limited by incomplete, ambiguous and inconsistent reporting. Access to complete and accurate data from clinical trials of drugs currently in use remains a pressing concern.
This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/
Strengths and limitations of this study
Published journal articles from antidepressant and antipsychotic drug trials report substantially fewer serious adverse events than associated clinical trial summaries posted by industry trial sponsors on a previously active online registry.
Our findings of inconsistencies and ambiguities in serious adverse event reporting in journal articles and trial summaries suggest that registries might not provide meaningfully improved access to complete and transparent clinical trial data.
The registry from which we retrieved trial summaries has since been removed from the Internet and most trial summaries were not transferred with results to clinicaltrials.gov, making our analysis a unique examination of data that has been lost or scattered.
We examined only the first stand-alone journal article associated with each trial summary, so it is possible that additional harms outcomes and longer term outcomes absent from our sample of journal articles were reported in subsequent articles. Nevertheless, clear trends of incomplete reporting were apparent between journal article and trial summary sources.
Publication bias and concerns regarding the integrity of the medical treatment knowledge base have led to various mechanisms, such as publicly accessible clinical trial registries, to promote transparent and complete reporting of clinical trial results.1 ,2 As the next most accessible source of drug information after published articles, clinical trial summaries available in online trial registries might contribute to improved evidence synthesis since they are supposed to provide an inclusive synopsis of positive and negative results.3 ,4 In this study we compare serious adverse events (SAEs) found in industry-funded antipsychotic and antidepressant drug trial summaries posted by trial sponsors on an online trial registry, with SAEs found in published journal articles reporting on the same trials.
SAEs by definition result in death, hospitalisation or significant disability and are therefore particularly important to report from a clinical trial because of their potential impacts on treatment decision-making and patient safety. International Conference on Harmonisation (ICH) guidelines state that SAEs ‘deserve special attention’ relative to other types of adverse effects, including providing individual-level patient detail and narrative for each SAE in clinical trial reports submitted to regulatory agencies.5 Regulatory agencies in the USA and across Europe require trial sponsors to immediately report unexpected or life-threatening SAEs.6 ,7 However, the extent to which SAEs are then reported in outlets for clinicians, researchers and the public is unknown, though evidence suggests incomplete and ambiguous reporting of harms-related data.8–10 Recent settlements resulting from state and federal lawsuits in the USA against pharmaceutical manufacturers for minimising or concealing drug harms further highlight the need for increased diligence in discerning what important harm-related drug information might remain unknown or distorted in scientific outlets for reporting clinical trial results.11–13 While previous research has demonstrated that harms data are less completely reported in journal articles than clinical trial summaries, these studies provide primarily quantitative counts of reporting practices.8–10 The present analysis seeks to elaborate the nature of quantitative and qualitative differences in SAE reporting, and possible explanations for reporting discrepancies.
Antipsychotic and antidepressant drugs—which rank among the 10 highest-selling drug classes in the USA and the world14 ,15—are mainstay treatments in psychiatry and prescribed for myriad indicated and off-label, psychiatric and non-psychiatric uses.16 ,17 Journal publications, clinical trial summaries posted on trial registries and data from regulatory agencies such as the US Food and Drug Administration (FDA) currently represent the primary information sources for clinicians and decision-makers regarding the safety and effectiveness of drug treatments. In contrast to substantially lengthier accounts of trials found in clinical study reports submitted to regulatory agencies, clinical trial summaries are abbreviated, concise descriptions of trials’ backgrounds, methodologies and positive and negative results. Similar to clinical study reports, they are structured according to templates described in the ICH Guidelines for Industry: Structure and Content of Clinical Study Reports,5 though their level of detail can vary substantially. Using the clinical trial summaries for all trials of these drugs posted by industry sponsors on clinicalstudyresults.org, we aimed to (1) count and describe SAEs reported in trial summaries and, as applicable, their associated peer-reviewed journal articles, (2) assess the consistency of SAE reporting between pairs of trial summaries and associated journal articles and (3) categorise possible explanations for discrepant reporting.
Clinical trial summaries were retrieved from clinicalstudyresults.org, the former online public registry sponsored by the Pharmaceutical Research and Manufacturers of America (PhRMA). Published journal articles were identified using the bibliography listed on the cover page of each trial summary.
Clinical trial summaries
The clinicalstudyresults.org registry was established in 2005 by PhRMA as a single repository for pharmaceutical manufacturers to post result summaries of their sponsored clinical trials. At the time, the federally funded clinicaltrials.gov, established in 2000 and administered by the US National Institutes of Health, required manufacturers to register only the existence of their trials. According to PhRMA guidelines, complete results of all hypothesis-testing clinical trials completed after 2002 for products approved for marketing in the USA were to be submitted to its registry within 1 year after completion of the trial, and references to articles published in peer-reviewed journals added to the trial summary as soon as they were published.18
In May 2011, we retrieved all Phase II, III and IV clinical trial summaries (n=329) for all nine drugs within the antidepressant and antipsychotic classes listed on clinicalstudyresults.org. We excluded three drugs (desvenlafaxine, quetiapine and venlafaxine) with registered trials but no or few posted trial summaries. For the remaining six drugs (n=254 trial summaries) we retained the summaries with trial completion dates on or before 2008, allowing at least 2.5 years for a trial to reach publication in the peer-reviewed literature (see online supplementary appendix table S1). This resulted in 244 (74%) clinical trial summaries for six drugs from three manufacturers: aripiprazole (Abilify, Bristol-Myers Squibb), atomoxetine (Strattera, Eli Lilly), duloxetine (Cymbalta, Eli Lilly), olanzapine (Zyprexa, Eli Lilly), sertraline (Zoloft, Pfizer) and ziprasidone (Geodon, Pfizer). Trial summaries averaged 18 pages in length (range: 3–147). Online supplementary file 1 provides a trial summary illustrating the typical format of the documents in this sample. Trial summaries include premarketing studies that were sent to regulatory agencies for drug approval and postmarketing studies for new indications, additional outcomes and long-term follow-up.
We used the bibliography listed on the cover page of each trial summary to retrieve the earliest journal article reporting on the full trial. We emailed and telephoned the medical communications, clinical trials or customer relations department of each manufacturer of the included drugs to inquire about the completeness of the list of trial summaries and journal articles posted on clinicalstudyresults.org. No representative from any manufacturer could confirm completeness of the posted lists nor provide a current list of all clinical trials and journal publications for the respective drugs. Representatives directed us to visit clinicaltrials.gov to view current and completed trials, and PubMed for a list of publications. We then attempted to manually search PubMed to match possible additional publications with the trial summaries, but the absence of trial identification numbers in journal articles made it extremely difficult to crosscheck and match all sources reliably. These additional efforts, therefore, did not affect the final sample size.
We employed double data extraction. One coder extracted the number and exact description of SAEs reported to occur in drug-treated participants from the Results section of each trial summary and journal article. For multiphase trials, we tallied the SAEs occurring in each phase. The number of patients experiencing SAEs was counted in the few cases where the number of events was not provided, therefore underestimating the actual number of SAEs. We also extracted from each source the trial start and completion year, article publication date, study length, sample size, targeted indication and consistency of reporting SAEs (see explanation below). A second coder independently extracted these data from a 50% random sample of trial summaries and articles for three of the six drugs. A third coder repeated the same process for the other three drugs. The values obtained by the second and third coders were compared to those obtained by the first. Any discrepancies were resolved by consensus. Coding for most reports and articles was straightforward and few disagreements in recordings between coders were found.
We evaluated the consistency of the number and description of SAEs occurring in drug-treated participants reported between each trial summary and its associated article. The number of SAEs was considered inconsistent if (1) reported numbers differed between the two sources (eg, aripiprazole trial CN138–008: trial summary cited 7, journal article 6, SAEs), (2) one source reported the number of SAEs while the other contained no or an ambiguous statement about their occurrence; or (3) the journal article did not report the trial phase in which SAEs did occur according to the trial summary (eg, ziprasidone trial 1006: in a 60-week multiphase study with 8 SAEs reported in the summary, the article reports findings from the 8-week acute phase with zero SAEs). The description of SAEs was considered inconsistent if only one source described the events (eg, duloxetine trial 6091: the summary describes 1 SAE as an intentional overdose, the article omits the description but accurately reports the number), or if one source less completely described the events than the other source (eg, duloxetine trial 8601: the summary lists one death from suicide as well as other SAEs related to psychiatric worsening, but the article mentions only the suicide). Sources were considered consistent if both reported the number or description of SAEs identically, or if neither reported such information. In each instance of discrepant reporting, we performed an in-depth inductive analysis involving a careful review of the trial summary and journal article to identify a possible explanation for the discrepancy. We then grouped the emerging patterns, which resulted in three categories (described in the results section): differences in study length or phase reported, differences in reporting criteria used and apparent selective reporting of SAE data. Discrepancies were only assigned to the latter category after ruling out the other two explanations. No additional categories to explain discrepant reporting emerged from the analysis.
We used descriptive statistics to summarise quantitative variables related to study characteristics and frequencies for categorical variables. We calculated the number of SAEs per patient treated for each drug by dividing the number of SAEs reported in trial summaries and journal articles, respectively, by the total number of drug-treated participants.
We extracted exact descriptions of SAEs and then categorised them as: behavioural or cognitive, physical, no description provided and unspecified (including overdose, dependence, death or hospitalisation for unspecified reasons and accidental injury). We further counted the number of SAEs reported as death, suicide, suicide attempt, homicidal ideation and new or worsened psychiatric symptoms.
We calculated risk ratios to compare the likelihood that trial summaries reporting zero SAEs, versus those reporting ≥1 SAEs, were published as stand-alone journal articles in a manner congruent with the summaries. Risk ratios were calculated with 95% CIs and Pearson's χ2 analysis using PASW Statistics, V.18 software.19
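Under the standard large-sample (log-scale) method, the risk ratio, its 95% CI and the Pearson χ2 statistic reduce to a few lines of arithmetic. The sketch below uses a hypothetical 2×2 table — the full cross-tabulation behind the reported RR of 2.35 is not given in this article, so the counts and the resulting interval are illustrative only, not the study's actual values.

```python
import math

# Hypothetical 2x2 table (illustrative only; the paper does not report
# its full cross-tabulation):
#                        consistent   inconsistent
# zero-SAE summaries          a=30            b=4
# >=1-SAE summaries           c=68          d=113
a, b, c, d = 30, 4, 68, 113
n1, n2 = a + b, c + d              # group sizes

# Risk ratio: proportion published consistently in each group.
rr = (a / n1) / (c / n2)

# 95% CI via the standard error on the log scale.
se_log = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se_log)
hi = math.exp(math.log(rr) + 1.96 * se_log)

# Pearson's chi-square for a 2x2 table (df = 1).
n = a + b + c + d
chi2 = n * (a*d - b*c)**2 / ((a+b) * (c+d) * (a+c) * (b+d))

print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}, chi2 = {chi2:.1f}")
```

In practice the same quantities would come from a statistics package (eg, `scipy.stats.chi2_contingency` for the χ2 test); the hand calculation is shown only to make the reported measures concrete.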
Search results and sample selection
Using the bibliography listed on the cover page of each trial summary, we counted a total of 496 listed publications (an average of two publications per trial, with an average time to publication of 2.5 years), from which we retrieved the earliest journal article reporting on the full trial. From the total we excluded 261 (52.6%) subset analyses (ie, reports on a subset of the total sample based on a shared characteristic, such as gender), meta-analyses and conference abstracts. Of the 244 trial summaries, 72 (29.5%) listed no publication of any kind, 30 (12.3%) listed only one or more of the excluded publication types and 142 (58.2%) listed at least one associated stand-alone journal article (see figure 1). The final sample consisted of 142 trial summary-journal article pairs listed on clinicalstudyresults.org and an additional 102 trial summaries from the registry with no associated journal article.
For each of the six drugs included in this analysis, table 1 summarises trial characteristics as reported in trial summaries, their associated journal articles and the additional trial summaries having no associated journal article (referred to as unpublished trial summary on all tables and appendices). Overall, a stand-alone journal article was available for 58.2% of trials in this sample, though this varied by drug from a low of 27.6% for trials of ziprasidone to 72.9% for trials of duloxetine. Journal articles reported findings for an identical or nearly identical number of participants as their associated trial summaries. The 102 unpublished summaries, however, included data on an additional 20 084 drug-treated participants. The median study length was shorter in journal articles (11 weeks) than in their paired trial summaries (12 weeks) or unpublished trial summaries (16 weeks).
The three antipsychotic drugs (n=129 trial summaries) were being tested for the treatment of psychotic disorders (56.6% of studies), bipolar disorder or mania (26.4%), or other conditions (16.2%) such as depressive disorders, Alzheimer's, autism, alcohol dependence or borderline personality disorder. The three antidepressant drugs (n=115 trial summaries) were being studied for the treatment of attention deficit hyperactivity disorder (42.6%), depressive disorders (34.8%), anxiety disorders (8.7%) or other conditions (14%) such as pain-related disorders or post-traumatic stress disorder.
Serious adverse events in trial summaries
Ninety per cent of all trial summaries (n=244) reported a precise number of SAEs occurring in the trial. The 142 trial summaries with an associated journal article reported 1608 SAEs, and the 102 trial summaries with no associated journal article reported an additional 1423 SAEs. Table 2 details the total and per patient numbers of SAEs reported in trial summaries for each drug. Online supplementary appendix table 2 lists additional SAEs for the 10 excluded trial summaries with trial completion dates in 2009 or later.
No description was provided for 41% of the SAEs cited in trial summaries (46% and 20% of SAEs in antipsychotic and antidepressant trials, respectively). An additional 11.6% of SAEs were non-specifically described, such as ‘accidental injury’ in duloxetine trial 1126. When a specific description was present, we categorised 28.4% of SAEs as behavioural or cognitive and 18.9% as physical. Table 3 details all cases of death, suicide and new or worsened psychiatric symptoms for each drug.
Serious adverse events in journal articles
Nearly 40% of journal articles failed to specify the number of SAEs that occurred in the trial (table 2), containing either no statement related to SAEs or an ambiguous statement without an actual number of SAEs, such as sertraline trial 1060: ‘no subjects had SAE related to study treatment.’ A total of 914 SAEs were reported across the 85 journal articles that did include specific data on SAE occurrence.
Most SAEs (58.9%) reported in journal articles (61% in antipsychotic and 55.5% in antidepressant trials) had no accompanying description and another 8% were non-specifically described. Nearly one-fifth (18.9%) of SAEs were behavioural or cognitive in nature and 14.6% were described as physical. Table 3 shows that one-quarter of SAEs described in journal articles were categorised as death, suicide, homicidal ideation or new or worsened psychiatric symptoms.
Consistency of reporting in trial summary-journal article pairs
Just over half (56.8%) of the 1608 SAEs experienced by drug-treated participants according to trial summaries (n=142) were also reported in associated journal articles. This proportion varied widely between the drugs, from 14.8% of SAEs in atomoxetine trials to 114.6% in aripiprazole trials (see table 2). The number of SAEs per patient for most drugs was lower in articles (0.03, range: 0.003–0.07) than in associated summaries (0.05, range: 0.02–0.13). Trial summaries with no associated article averaged the highest number of SAEs per patient (0.07, range: 0.01–0.14).
Half or more of the 142 trial summary-journal article pairs were discordant in reporting the number (49.3%) or description (67.6%) of SAEs (table 4). In half of these pairs, the reported number of SAEs differed by more than 20% between the two sources.
Journal articles and associated trial summaries failed to describe a substantial proportion of SAEs. Most cases of death (62.3%) and suicide (53.3%) cited in trial summaries were not reported in associated journal articles (table 3).
The 34 trial summaries with zero SAEs were 2.35 (95% CI, 1.58 to 3.49; p<0.001) times as likely as the 181 summaries with one or more SAEs to have an associated journal article whose SAE reporting was consistent with the trial summary data.
Explanations for discrepant reporting
Seventy (49.3%) of the 142 trial summary-journal article pairs were discrepant in SAE reporting. Half of these instances might be explained by differences between sources in the study length or phase being reported (25%, 18/70) or in the reporting criteria used (25%, 18/70). Table 5 provides examples of each of these forms of discrepant reporting. Importantly, while some journal articles appeared to apply more restrictive reporting criteria that might lead to omitting certain data, the many articles that did report exact SAE numbers often did so regardless of presumed causality to the study drug. For example, articles and summaries for olanzapine trials 3131 and 7031 reported all SAEs even though some events were thought to be unrelated to the study drug. Yet, the article for olanzapine trial 4414 separately details SAEs thought to be related and unrelated to the drug.20 Thus, no clear or consistent pattern on SAE reporting criteria emerged from this sample of journal articles.
Another one-third (34.3%, 24/70) of discrepancies appear to be simple failures of journal articles to report complete SAE data (see table 5). In a minority (14.3%, 10/70) of cases, however, the journal article provided more precise data or a higher number of SAEs than the trial summary. The article for aripiprazole trial CN138–050, for example, cites six SAEs in drug-treated participants,21 while the summary states only that the incidence of SAEs was low.
Post hoc analysis of clinical trial summaries on clinicaltrials.gov
In December 2011, clinicalstudyresults.org was removed from the Internet for unknown reasons. An Internet Archive capture of the website suggests that the expansion of other registries made clinicalstudyresults.org seem redundant from the industry's perspective.22 One year after this removal of the registry, we cross-checked our data source by searching for each of the 244 trial summaries on clinicaltrials.gov. (In that database, the US Food and Drug Administration Amendments Act (FDAAA) of 2007 newly mandated trial sponsors to include summary reporting of results for trials that were initiated after or ongoing as of late 2007.) Our search revealed that 139 (57%, range across drugs: 25–80%) of the 244 trials were registered on clinicaltrials.gov, but only 15 of these (10.8%, range across drugs: 0–39%) had posted study results. In October 2013, nearly 2 years after the clinicalstudyresults.org takedown, these numbers had barely changed, with 19 registered trials now reporting study results. While nearly all (99%) of the trial summaries not currently registered on clinicaltrials.gov have trial start or completion dates prior to 2007, 75% of trial summaries that are registered on the website also have pre-2007 trial dates. In the interest of openness and transparency, we created a publicly accessible website (http://www.rxarchives.com) where all 244 trial summaries are posted in pdf format and freely available for download.
This study demonstrates that a substantially lower number of SAEs appear in published journal articles than registered trial summaries of antidepressant and antipsychotic drug trials, and shows further that both sources for drug information are often inconsistent or ambiguous in SAE reporting. In this study, 43.2% of all SAEs appearing in 142 trial summaries posted on an online registry across six psychotropic drugs were not reported in the first associated stand-alone journal articles listed by the drug's manufacturer. Failure to describe the nature of SAEs was also common in both sources. Many consumers of psychotropic drugs take these medications for months or years; that approximately one-quarter of journal articles reported only acute phase results of longer term trials, and that the median study length in trial summaries with an associated journal article (12 weeks) was 4 weeks shorter than in trials without one (16 weeks), therefore highlights an additional attrition of evidence on longer term outcomes.
These findings are congruent with other recent analyses demonstrating more complete outcomes information in registered clinical trial summaries compared to published journal articles,9 although examination of full clinical study reports reveals that both of the latter sources suffer from incomplete reporting of key data.10 Similar to our results, Riveros et al9 found that registered trial summaries (99%; present study 90%) more often report data on SAEs compared to published articles (63%; present study 60%). However, in an analysis comparing publicly available data in registered clinical trial summaries and journal publications to full clinical study reports submitted to a regulatory agency for drug products, the former sources reported complete information on harms outcomes significantly less (∼25%) than clinical study reports (87% of harms outcomes reported completely).10 SAEs, specifically, were reported completely only 51% of the time in journal articles and trial summaries, and 30% of SAE outcomes were not reported at all in these sources. In their analysis of full clinical study reports on the influenza drug Tamiflu, Doshi, Jefferson and Del Mar3 are alarmed by the important data remaining unknown to most physicians when clinical trial information is limited to the published journal literature. The occurrence of SAEs and the rationales for classifying events as adverse are among many possible discoveries in clinical study reports that can markedly alter a drug's benefit-to-risk profile. While publication bias of this sort in the literature has long been acknowledged or suspected,23–25 the present study clarifies the degree to which such bias distorts the perception of important harms outcomes (ie, number and nature of SAEs) across two classes of popularly used psychotropic drugs. Also, this study adds to the evidence base questioning whether information posted in online clinical trial registries represents meaningful improvement.
For another 102 trials with no associated stand-alone journal article in the present study, the clinical trial summaries report an additional 1423 SAEs and represent the only publicly available data source on these trials. In a recent examination of 585 large randomised trials registered on clinicaltrials.gov, 29% had no associated journal publication and most (78%) of those also had no results available on the clinicaltrials.gov registry.26 Riveros et al9 found that 50% of 594 randomly sampled controlled drug trials on clinicaltrials.gov had no corresponding published article. These findings highlight the necessity for clinicians, researchers and decision-makers to consult multiple sources in order to achieve a comprehensive and more complete appraisal of drugs’ safety profiles, although again, clinical trial summaries are themselves limited by incomplete reporting10 ,27 and by regulatory policies that require registration of only recent1 or new trials.28
Our post hoc analysis further revealed that, while 57% (139/244) of the present sample of trial summaries are registered on clinicaltrials.gov, only 7.8% (19/244) are available on the registry with results. Three-quarters of these currently registered trials have trial start or completion dates prior to 2007, thereby suggesting that actual registration practices on clinicaltrials.gov may be more inclusive than the minimum requirements set out by the FDAAA. Access to the full evidence base of drugs currently in use, including recent studies and those conducted prior to widespread deployment of registries, is essential for sound treatment decision-making and the assurance of present day patient safety,10 ,29 but the important efficacy and harms information contained in these 225 trials on six psychotropic drugs has been lost or scattered. As of this writing, Pfizer (sertraline and ziprasidone) and Bristol-Myers Squibb (aripiprazole) company websites include trial summaries or links to clinicaltrials.gov only for trials completed or ongoing as of 2007, in accordance with FDAAA guidelines. All clinical trial summaries included in the present analysis for atomoxetine, duloxetine and olanzapine are available on Eli Lilly's company website. Some data, then, have been lost to the evidence base with the removal of clinicalstudyresults.org, while other data are still available but no longer accessible through a single repository. The important harms data contained in the present body of trial summaries provide further support for the recommendation that all ongoing, recent and archive drug trials for all new and existing drugs be made available to clinicians and consumers in a clear and accessible format, including links between all trial-related documents (journal articles, registry records, trial protocol and so on) for transparent navigation of each trial component to the core study.10 ,30 ,31
The present study has important limitations and strengths. First, although participating industry trial sponsors had posted on their respective websites statements of their commitment to posting all trial results in a timely manner on clinicalstudyresults.org, the completeness and accuracy of trial summaries on clinicalstudyresults.org could not be verified. However, since our crosscheck of summaries on clinicaltrials.gov revealed that few of these trials were transferred with results, our present analysis provides a glimpse of unique trial evidence that a contemporary standard database fails to capture. Second, only the first stand-alone journal article for each trial was included in this analysis. For trials with multiple publications, additional information on SAEs might appear in subsequent articles. However, this possibility might be slight as the median number of journal articles per trial summary was one, and over half of total articles listed for the six drugs were pooled or subset analyses or conference abstracts. We do not know whether the trends observed in the 142 trial summary-journal article pairs would hold for the other 102 trials. Finally, the results of this study cannot be generalised to other drugs and drug classes, but they do add to the substantial body of empirical findings demonstrating poor adverse event assessment and reporting practices and a distortion of evidence through selective reporting of industry-sponsored psychotropic drug research.24 ,32–35
The integrity of the medical treatment knowledge base preserves sound clinical practice and ensures patient safety. If nearly half of SAEs in psychotropic drug research are not reported in journal articles, and many others can only be found in sources not easily accessed by relevant treatment decision-makers,3 ,10 ,36 then, without integrating multiple data sources, benefit-to-harm assessments made by groups constructing clinical guidelines and by individual clinicians making prescription decisions are based on incomplete evidence and likely biased toward underestimating risks. Multiple solutions to the grave problem of incomplete reporting of clinical trials have been proposed, and some recent strides have been made. Some suggest shifting toward public funding and control of drug research in order to produce credible information accessible and transparent to all stakeholders.37–40 Some propose to treat failures to disclose complete knowledge of adverse effects from clinical trials as criminal offences requiring criminal prosecution of responsible individuals and companies.41 At the same time, ongoing campaigns have gained momentum across the UK in calling for pharmaceutical manufacturers to share clinical study reports on all drugs in use31 and in the USA for sharing clinical trial datasets with independent scientists.42 Many agree, however, that regulatory requirements for registering new and ongoing studies do not adequately protect the millions of patients currently taking prescription drugs,10 ,31 and the pharmaceutical industry has been slow to accept the level of openness that scientists and the public have been calling for.31 ,43
The present findings highlight inconsistencies in harms-related reporting between published articles and trial registry summaries of psychotropic drugs, and indicate that clinical decisions regarding drug use may be based on substantially truncated evidence. Policy discussions in this area should consider to what extent patients who use drugs, clinicians who prescribe drugs and the public who finance most of their use deserve access to complete and accurate scientific data from drug trials.
Contributors SH and DC conceived the idea of the study and were responsible for its design. SH and RJ were responsible for the acquisition of the data, undertook data coding and analysis and produced the tables. DC also contributed to data coding and provided critical input into the analysis and interpretation of results. The initial draft of the manuscript was prepared by SH and then circulated repeatedly among all authors for critical revision.
Funding This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests None.
Provenance and peer review Not commissioned; externally peer reviewed.
Data sharing statement All clinical trial summaries that were analysed in this study have been uploaded by the first author (SH) to a publicly accessible website (http://www.rxarchives.com) for download.