
The Use of Google Trends in Health Care Research: A Systematic Review

  • Sudhakar V. Nuti,

    Affiliation Center for Outcomes Research and Evaluation, Yale-New Haven Hospital, New Haven, Connecticut, United States of America

  • Brian Wayda,

    Affiliation Yale School of Medicine, New Haven, Connecticut, United States of America

  • Isuru Ranasinghe,

    Affiliation Center for Outcomes Research and Evaluation, Yale-New Haven Hospital, New Haven, Connecticut, United States of America

  • Sisi Wang,

    Affiliation Yale School of Public Health, New Haven, Connecticut, United States of America

  • Rachel P. Dreyer,

    Affiliation Center for Outcomes Research and Evaluation, Yale-New Haven Hospital, New Haven, Connecticut, United States of America

  • Serene I. Chen,

    Affiliation Yale School of Medicine, New Haven, Connecticut, United States of America

  • Karthik Murugiah

    karthik.murugiah@yale.edu

    Affiliation Center for Outcomes Research and Evaluation, Yale-New Haven Hospital, New Haven, Connecticut, United States of America

Abstract

Background

Google Trends is a novel, freely accessible tool that allows users to interact with Internet search data, which may provide deep insights into population behavior and health-related phenomena. However, there is limited knowledge about its potential uses and limitations. We therefore systematically reviewed health care literature using Google Trends to classify articles by topic and study aim; evaluate the methodology and validation of the tool; and address limitations for its use in research.

Methods and Findings

PRISMA guidelines were followed. Two independent reviewers systematically identified studies utilizing Google Trends for health care research from MEDLINE and PubMed. Seventy studies met our inclusion criteria. Google Trends publications increased seven-fold from 2009 to 2013. Studies were classified into four topic domains: infectious disease (27% of articles), mental health and substance use (24%), other non-communicable diseases (16%), and general population behavior (33%). By use, 27% of articles utilized Google Trends for causal inference, 39% for description, and 34% for surveillance. Among surveillance studies, 92% were validated against a reference standard data source, and 80% of studies using correlation had a correlation statistic ≥0.70. Overall, 67% of articles provided a rationale for their search input. However, only 7% of articles were reproducible based on complete documentation of search strategy. We present a checklist to facilitate appropriate methodological documentation for future studies. A limitation of the study is the challenge of classifying heterogeneous studies utilizing a novel data source.

Conclusion

Google Trends is being used to study health phenomena in a variety of topic domains in myriad ways. However, poor documentation of methods precludes the reproducibility of the findings. Such documentation would enable other researchers to determine the consistency of results provided by Google Trends for a well-specified query over time. Furthermore, greater transparency can improve its reliability as a research tool.

Introduction

New tools are emerging to facilitate health care research in the Big Data era. One form of Big Data is that which accumulates in the course of Internet search activities. Internet search data may provide valuable insights into patterns of disease and population behavior.[1] In fact, the Institute of Medicine recognizes that the application of Internet data in health care research holds promise and may “complement and extend the data foundations that presently exist”.[2] An early and well-known example of utilizing Internet data in health has been the surveillance of influenza outbreaks with comparable accuracy to traditional methodologies.[3]

One tool that allows users to interact with Internet search data is Google Trends, a free, publicly accessible online portal of Google Inc. Google Trends analyzes a portion of the three billion daily Google Search searches and provides data on geospatial and temporal patterns in search volumes for user-specified terms.[4] Google Trends has been used in many research publications, but the range of applications and methods employed has not been reviewed. Furthermore, no guidance or agreed standards exist for the appropriate use of this tool. A critical appraisal of the existing literature would increase awareness of its potential uses in health care research and facilitate a better understanding of its strengths and weaknesses as a research tool.

Accordingly, we performed a systematic review of the health care literature using Google Trends. To characterize how researchers are using Google Trends, we classified studies by topic domain and study aim. We conducted a subanalysis of surveillance studies to further detail their methods and approach to validation. We also assessed the reproducibility of methods and created a checklist for investigators to improve the quality of studies using Google Trends. Finally, we address general limitations in using Google Trends for health care research.

Methods

Overview of Google Trends

Google Trends provides access to Internet search patterns by analyzing a portion of all web queries on the Google Search website and other affiliated Google sites.[5] A description of the user interface is shown in Figure S1. Users are able to download the output of their searches to conduct further analyses.

The portal determines the proportion of searches for a user-specified term among all searches performed on Google Search. It then provides a relative search volume (RSV), which is the query share of a particular term for a given location and time period, normalized by the highest query share of that term over the time-series.[6], [7] The user can specify the geographic area to study, whether a city, country, or the world; data is available for all countries worldwide. Furthermore, the user can choose a time period to study, ranging from January 2004 to present, divided by months or days. The user is also able to compare the RSV of up to five different search terms or the RSV of a particular search term between geographic areas and between time periods. In addition, the user can choose from 25 specific topic categories to restrict the search, each with multiple subcategories, for >300 choices in total, such as “Health → Mental Health → Depression”.
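As an illustrative sketch of this normalization step (not Google's actual algorithm, which is not public; the query-share numbers below are invented for the example):

```python
# Sketch of relative search volume (RSV) normalization: each period's query
# share is rescaled so the period with the highest share scores 100.
# The input values are hypothetical query shares per 100,000 searches.

def relative_search_volume(query_shares):
    """Scale a time series of query shares so its maximum equals 100."""
    peak = max(query_shares)
    return [round(100 * share / peak, 1) for share in query_shares]

# Monthly query shares for a hypothetical term.
shares = [20, 40, 80, 60, 30]
print(relative_search_volume(shares))  # → [25.0, 50.0, 100.0, 75.0, 37.5]
```

Note that because the output is normalized within the requested location and time window, changing either field rescales every value, which is one reason full documentation of those fields matters for reproducibility.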

With respect to search input, multiple terms can be searched in combination using “+” signs, and terms can be excluded with “-” signs. Quotations can be used to specify exact search phrases.[8]
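As a hedged illustration of this input syntax (the terms below are hypothetical examples, not drawn from the reviewed studies):

```python
# Illustrative Google Trends search inputs. Per the syntax described above:
# "+" combines terms (matches searches containing either one), "-" excludes
# a term, and quotation marks force an exact phrase.

combined = "flu + influenza"      # searches containing either term
excluded = "flu -shot"            # "flu" searches that do not mention "shot"
exact_phrase = '"flu symptoms"'   # the exact phrase, words in this order
```

Ambiguity between these forms (e.g., quotation marks used typographically versus as actual search syntax) is precisely the documentation problem examined later in this review.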

Study Selection

The review was conducted in accordance with PRISMA guidelines.[9] We included all studies that used Google Trends to answer research questions within the domain of health care. After an initial review, we included letters because they contained substantial original content. We also included studies using Google Insights for Search, a similar tool to Google Trends that was merged into Google Trends in 2012 (hereafter we will refer to studies using Google Insights for Search as using Google Trends for ease of reading).

We excluded studies that primarily focused on Google Flu Trends, a separate tool to specifically track seasonal variation in influenza trends. This tool is distinct from Google Trends and is therefore beyond the scope of this review. We also excluded articles that had no substantial use of Google Trends.

Search Strategy

We identified relevant studies by searching Ovid MEDLINE (from inception to January 3, 2014) using a comprehensive search strategy. The list of Medical Subject Headings (MeSH) and text words used in the search strategy for MEDLINE can be found in Appendix S1. We only included studies of humans written in the English language, and identified 1249 potential articles for inclusion. Since PubMed contains articles from life science journals in addition to articles indexed in MEDLINE, we conducted a search of PubMed (from inception to January 3, 2014) using a similar search strategy, but excluding the articles already identified from MEDLINE. This search identified an additional 871 potential articles, for a total of 2120 potential articles.

Two reviewers (S.V.N. and K.M.) independently reviewed the titles and abstracts of retrieved publications, and 92 articles met our inclusion criteria for full text review. We then excluded 25 studies that did not utilize Google Trends or that met at least one of our exclusion criteria (See Figure 1). We also included 3 articles found from the review of references.

The remaining 70 studies that met our inclusion criteria and did not meet our exclusion criteria form the studies included in this review. Data were abstracted from these studies using a standardized instrument, described below and in Table 1. All extractions were performed by at least two of the authors, and disagreements were resolved by consensus. We did not pool the results due to the heterogeneity of the articles, but we provide summary statistics.

Evaluation of studies

Article Classification.

To characterize how researchers are using Google Trends, we created a general descriptive classification of the articles according to their topic domain and study aim using an iterative process. The research team first worked together to examine all of the articles and identify common themes among the articles. We then assigned each article to the themes that emerged. After this initial step, we reassessed these groupings, refining the categorical domains and reassigning articles as needed, to create a classification construct that best characterizes the articles in the review. All disagreements during this process were resolved by consensus. This resulted in four topic domains (infectious disease, mental health and substance use, other non-communicable diseases, general population behavior) and three study aims (causal inference, description, surveillance). Of note, to categorize study aim, we examined the primary aim of the study as stated by the authors in the introduction of the paper. The definitions of these categories are described in the results.

Variable Abstraction.

The variables extracted, along with the standard definitions and rationale for their selection, are listed in Table 1. These pertain to each study’s purpose, methodology (search variables, search input, and type of analysis), primary findings, and citations accrued.

We defined an article as “reproducible” if the authors provided clear documentation of all fields modifiable by the user, namely the location of the search, the time period of the search, the query category, and the terms utilized, as well as clear documentation of any combination of terms and quotations used, where applicable. Only articles that clearly provided each of these fields (or for which a given field was deemed not applicable, with all other fields provided) were defined as reproducible. We defined a “clear search input” as one with a clear use of quotations and/or combinations of terms, where applicable (see Table 1).

Subanalysis of Surveillance Studies.

Following the popularity of Google Flu Trends, there is particular interest in operationalizing Google Trends data as independent surveillance systems for other diseases. However, such surveillance systems require high standards of testing and validation before being deployed in the real world. Given these particular challenges, we performed a subanalysis of surveillance studies (as determined by study aim), abstracting additional information including the data sources used for validation, the strength of the relationship between predictions and external data, and other methodological characteristics listed in Table 1 and Table S1. We only assessed whether validation data were used in surveillance studies; we did not assess the quality of the validation process or data.

Assessment of Bias.

Conventional tools to assess bias are largely limited to randomized trials and observational studies and are not readily applicable to studies using Google Trends data, which is observational in nature but does not involve individual research participants.[10], [11] Therefore, we attempted to address the two primary sources of potential bias within these studies: the search strategy and the validation of surveillance studies. Search methodology may introduce bias, as the selection of terms and changes in search input can alter results. We therefore captured all aspects pertinent to search strategy, including the provision of rationale for search input, for each article. The data sources and methods for validating findings in surveillance studies are also sources for bias, which we assessed in our subanalysis. We assessed for publication bias by evaluating the number of studies with positive findings versus neutral/negative findings.

Results

Study Sample

The 70 articles included in this systematic review are outlined in Table 2. Overall, 92% were original articles; the remaining were letters. Among the articles identified, we observed a seven-fold increase in publications utilizing Google Trends from 2009 to 2013 (Figure S2). Sixty-three percent of the articles studied areas beyond the United States alone. The median number of article citations was 7 (interquartile range, 1–16). The majority of studies (93%) presented positive findings, as opposed to neutral or negative findings, indicating the possibility of publication bias.

Classification of Published Google Trends Articles

Topic Domain.

We classified articles by the primary topic addressed by each article. By consensus we identified four main topic domains: infectious diseases (27% of articles), mental health and substance use (24%), other non-communicable diseases (16%), and general population behavior (33%). The general population behavior category included all health-related behaviors excluding mental health and substance use.

Study aim.

There were three categories of study aim: causal inference (27%), description (39%), and surveillance (34%). We defined causal inference studies as those in which the primary aim was to evaluate a hypothesized causal relationship with Google Trends data. An example of a causal inference study is Ayers et al. (2014), who used search queries to explore the potential link between a public figure’s cancer diagnosis and population interest in primary cancer prevention. We defined descriptive studies as those that aimed to describe temporal or geographic trends and general relationships, without reference to a hypothesized causal relationship. An example of a descriptive study is Stein et al. (2013), who assessed public interest in LASIK surgery and how levels of interest have changed over time in the United States and other countries. A particular subset of descriptive studies were surveillance studies, which we defined as those in which the stated aim was to evaluate the use of Google Trends to predict or monitor real-world phenomena. An example of a surveillance study is Desai et al. (2012), who assessed whether Google search trends are appropriate for monitoring Norovirus disease.

Methodology of Published Google Trends Articles

Documentation of Search Strategy.

Table 3 summarizes the documentation of search strategy. Only 34% of articles documented the date the search was conducted.

Of the variables that can be manipulated within the portal, 87% of papers documented in their methods section the location searched, 87% the time period searched, and only 19% clearly stated the query category used.

With respect to the search input, only 39% provided a clear search input. Excluding studies with only a single, one-word search term, which are not eligible for using quotations or combinations of terms, only 31% provided a clear search input. Of the articles eligible for using quotations (search terms with >1 word), 81% were unclear and 19% did not use quotations; none provided a clear use of quotations. Of the articles eligible for using a combination of terms (>1 search term used), 31% used a combination, 18% were unclear, and 51% used individual terms.

Reproducibility and Rationale.

Overall, only 7% of articles provided requisite documentation for a reproducible search strategy within their methods section; among original articles alone it was 8%.

In addition, we found that only 67% of articles provided a rationale for their search input.

Analytic Method.

Time trend analysis (comparisons across time periods) was used by 70% of the studies, cross-sectional analysis (comparisons across different locations at a single time period) by 11% of studies, and both by 19%. A variety of analytic methods were used in conjunction with Google Trends output data, including correlation, continuous density hidden Markov models, ANOVA, Box–Jenkins transfer function models, t-tests, autocorrelation, multivariable linear regression, time series analyses, wavelet power spectrum analysis, Cosinor analysis, and the Mann-Whitney test.

Subanalysis of Surveillance Articles and Validation

Among the 24 surveillance studies, 71% used time trend analysis, 25% cross-sectional analysis, and 4% both. Among articles using time trend analysis, 33% utilized lead-time analysis (using Google Trends data from a specific time point/interval to predict events occurring at a later time). Overall, 17% of studies used training/testing data sets and 13% had a time horizon (time period over which surveillance was assessed) <1 year. More detailed information can be found in Table S1.
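A minimal sketch of the lead-time analysis described above, with invented weekly data (this illustrates the general approach, not the method of any reviewed study): the Google Trends series at time t is correlated with reference counts at time t + lag, and the lag maximizing the correlation estimates how far searches lead the phenomenon.

```python
# Lead-time analysis sketch: correlate search interest at week t with
# reference case counts at week t + lag, for several candidate lags.
# All series values are invented for illustration.

from statistics import mean

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length series."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def lead_time_correlation(rsv, cases, lag):
    """Correlate RSV at week t with reference counts at week t + lag."""
    return pearson(rsv[:len(rsv) - lag] if lag else rsv, cases[lag:])

rsv = [10, 40, 90, 100, 60, 30, 15]   # weekly search interest (peaks week 4)
cases = [2, 5, 20, 55, 70, 40, 18]    # weekly reported cases (peaks week 5)
best_lag = max(range(4), key=lambda k: lead_time_correlation(rsv, cases, k))
print(best_lag)  # → 1: in this toy data, searches lead cases by one week
```

Studies such as Samaras et al. (2012), which reported a five-week lead for scarlet fever, follow this general logic, though with more formal time-series models than a raw lag scan.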

With respect to validation, 92% made comparisons against external datasets to validate the Google Trends output; the remaining 8% did not validate their findings. Examples of sources of comparison datasets include disease prevalence data from agencies such as the United States Centers for Disease Control and Prevention, drug revenues from shareholder reports of pharmaceutical companies, and unemployment rates from the Australian Bureau of Statistics.

There was a wide range of correlation statistics, from 0.04 to 0.98 (Figure S3). Among the 15 papers that used Pearson product-moment correlation, 80% had at least one correlation statistic greater than 0.70.

Checklist for the Documentation of Google Trends Use

In view of the limitations of existing studies identified during this review, we developed a checklist to improve the quality and reproducibility of studies that use Google Trends in the future (Table 4). This was created based directly on the variables that can be manipulated within the Google Trends portal, differences in which could provide differing results among researchers, and the need to provide search strategy rationale. A hypothetical example of a “well-documented” search strategy is included within Table 4. Of note, we used brackets to separate the search input from the body text to ensure that the reader understands what was searched for with clear syntax; similar approaches of segregation might be used.
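As a hypothetical illustration (paralleling, not reproducing, the checklist in Table 4), a fully documented search strategy could be recorded in a structured form like the following; every field value below is invented:

```python
# Hypothetical record of a "well-documented" Google Trends search strategy,
# covering each user-modifiable field the checklist asks authors to report.

search_record = {
    "date_of_search": "2014-03-17",
    "search_input": '["flu" + "influenza"]',  # brackets segregate exact syntax
    "combination_used": True,                  # "+" joins the two terms
    "quotations_used": True,                   # exact phrases were searched
    "location": "United States",
    "time_period": "2004-01 to 2014-03",
    "query_category": "Health",
    "rationale": "lay and clinical names for the same illness",
}

# A search is reproducible only if every applicable field is documented.
required = {"date_of_search", "search_input", "location",
            "time_period", "query_category"}
assert required <= set(search_record)
```

Recording the strategy in a fixed schema like this makes it trivial for a second researcher to re-run the identical query and check whether the output has drifted over time.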

Discussion

In this systematic review of the use of Google Trends in health care research, we found that researchers are increasingly utilizing the tool in a diversity of areas in myriad ways; these articles are being widely cited. Furthermore, the majority of surveillance studies validated Google Trends output against external datasets and many had strong correlation statistics. However, the majority of studies lack thorough documentation of search methodologies, which precludes the reproducibility of results; less than 10% of articles are reproducible. In addition, search rationale is often not provided. Thus, while the data within Google Trends holds promise, significant variability and limitations remain around study quality and reliability.

The 70 papers included in our review reflected a wide variety of topics and uses. A large proportion of articles used Google Trends to investigate population behavior, which is a logical application of the tool given its basis in user searches. The large proportion of infectious disease articles may stem from the precedent set by Google Flu Trends.[3] Nearly equal numbers of studies used Google Trends for causal inference, surveillance, and description, demonstrating the ability to use the tool to answer a range of questions. There was an increase in publications over time, and the median citation rate (7 per article) is comparable to the average for all scientific articles (7.64 per article).[12] These observations suggest increasing awareness of and the leveraging of information from the tool. Locations studied using the tool were geographically widespread, particularly outside of the United States, where conventional data collection may be challenging and resource intensive. Nevertheless, there is evidence of a positive results publication bias, which may be due to the novelty of the tool and authors not submitting and/or editors not accepting negative – and, therefore, perhaps uninteresting – results.[13], [14] This publication bias may also be due to researchers retroactively constructing hypotheses about interesting findings after the results are known for a given Google Trends experiment, which can be fast and easy to conduct.[15]

Despite the potential insights and research opportunities that Google Trends provides, many problems were observed with the documentation of methodology. Thorough documentation is necessary to ensure the reproducibility and replicability of the results, which are fundamental tenets of good science.[16] The inability to reproduce studies in the sciences has been well-documented, and it serves as a central problem to the utility and credibility of research.[17], [18] Yet, in our study, only 7% of articles provided clear documentation of the necessary fields to be reproducible. This is especially salient for using Google Trends given the many search fields and multiple options available within each field. Researchers may not have known how to document their methods since this is still a nascent tool for research, without guidance or methodological standards for its use from either Google Inc. or the research community. Furthermore, there were particular issues with the clarity of search inputs. For example, it was often unclear whether quotations provided for a search term were actually used in the search input or were merely given to distinguish the term from the rest of the text. A potential reason for the varying presentation of search input syntax is that the possible syntaxes may have changed over time.[8], [19] Nevertheless, the poor documentation of methods also raises larger questions about researchers using Google Trends without knowing the ways in which the tool can be operated.

Different selections of terms to address a common question with Google Trends can produce disparate results and conclusions, and providing the rationale behind these selections is necessary for a reader to better understand the study methods and to increase the face validity of the study.[20] Yet, studies often provided no rationale for their search input. For instance, we do not know why studies chose a given selection of terms or used a specific syntax. Furthermore, there are larger questions about the search strategy as a whole, such as why certain query categories and dates for searching were chosen. Nevertheless, certain studies demonstrated more thorough search strategies and strong rationales for search inputs, particularly accounting for the basis of Google Trends data in user searches. For instance, Desai et al. (2012) included potential misspellings of their search words to fully capture a specific search pattern. In addition, Cho et al. (2013) developed their search inputs by surveying their population of interest, in which they inquired about what search terms subjects would have used to search for influenza. Similar strategies could be adopted by future studies to ensure that their search terms accurately capture the outcome of interest. More guidance from Google is needed to assist researchers in producing an optimal search strategy to answer a given question.

Over 90% of surveillance studies compared Google Trends with established data sets, which were often trusted sources of surveillance data. A large number of correlation studies had moderate to strong strengths of association, which demonstrates the potential of Google Trends data to be used for the surveillance of health-related phenomena. For example, Jena et al. (2013) found a strong correlation between searches for HIV and US CDC HIV incidence rates, and were able to construct a model based on searches from years 2007–2008 to accurately predict state HIV incidence for 2009–2010. Moreover, Samaras et al. (2012) showed that Google Trends could have been used to forecast the peak of scarlet fever in the UK 5 weeks before its arrival. Although studies are promising, strong correlations alone do not support the use of Google Trends for surveillance, and further work is needed to substantiate the reliability and real world applicability of Google Trends as a tool to monitor health-related phenomena.

In light of our results, we have proposed a basic checklist for the documentation of Google Trends use. This checklist can serve as a baseline standard to ensure methodological understanding and reproducibility by researchers who choose to use the tool in the future.

While this checklist is a good step forward to improve the reproducibility of results by researchers, there are still larger limitations in the Google Trends tool itself. We cannot clearly ascertain user characteristics and intent from search data, which limits the ability to draw generalizable conclusions about population search behavior. In addition, Google Trends captures the search behavior of only a certain segment of the population – those with Internet access who use Google Search rather than other search engines. However, the major limitation of Google Trends is the lack of detailed information on the method by which Google generates this search data and the algorithms it employs to analyze it. Furthermore, changes in the interface and capabilities of Google Trends over time are not documented, which may lead to variation in the search output and therefore in study findings.

Moving forward, several steps can be taken to improve the verification of Google Trends study results and the reliability of the tool for research, both on the part of the independent researcher and Google Inc. Researchers should strive to make the raw data they downloaded from Google Trends available online (as Yang et al., 2010 did [21]) and create an archive or screenshot of the website as they searched it (as Sueki et al., 2011 did [22]) to provide transparency of their methodology and encourage open science with this open tool. Researchers could also evaluate the methods and results of others and themselves to check for consistency over time. We encourage Google Inc. to provide a chronology of important changes to Google Trends – in the past and to come – to put researchers’ methods and findings in context. Furthermore, if Google Trends continues to be used for research purposes, a discussion and collaboration between Google Inc. and the research community is necessary to create a set of best practices to ensure that the tool is being used responsibly and that its tremendous potential to derive meaningful insights from population search behavior can be fully harnessed. While full transparency may not be possible due to commercial sensitivities, informed guidance is needed to ensure the conduct of ethical science. For example, Google Inc. could work together with groups of researchers to detail how to construct optimal queries that fully take advantage of the algorithms at work, and to improve the tool to increase the quality of research. In addition, it is important to remember that these conclusions apply not only to Google Trends but also to other similar tools not intrinsically designed for research, whether they currently exist or will emerge from existing data sources.
In a Big Data era where information and technologies, particularly those readily accessible to the public and research community, are proliferating, care must be taken in their application to scientific research, and efforts must be made to ensure the conduct of good science. One need look no further than the recent controversy around the reliability of Google Flu Trends data for predicting influenza incidence, and the lack of transparency and inability to verify its results.[23]

Our study has certain limitations. First, given the diversity of topics and uses, there are inherent challenges in the classification of articles. However, at least two independent abstractors reviewed each article and category of abstraction, and disagreements were resolved by group consensus. Second, there are no prior standards to evaluate literature from novel data sources such as Google Trends. Third, our assessment of Google Trends was based on the current syntactic possibilities, but they may have changed over time.[8], [19] Indeed, this supports our concerns about undocumented changes to the tool. Finally, there is a possibility that we had an incomplete retrieval of Google Trends articles in our search strategy. However, we conducted an extensive, systematic search of two databases, in addition to reviewing article references, to capture as many articles as possible. Notably, our study focused on the evaluation of the use of Google Trends in research, and we refrained from making any commentary about the conclusions drawn by researchers in these studies. Further studies are needed to rigorously evaluate the interpretations of causal inference studies and the validity of Google Trends for surveillance.

Conclusion

Google Trends holds potential as a free, easily accessible means to access large population search data to derive meaningful insights about population behavior and its link to health and health care. However, to be reliably utilized as a research tool, it would have to be more transparent, which will increase the trustworthiness of both the results generated and its general applicability for health care research. Furthermore, researchers must make efforts to clearly state their rationale and document their experiments to ensure the reproducibility of results. The lessons gleaned from this review are also instructive for other tools not intrinsically designed for research that may emerge in an era of Big Data to ensure that they are used appropriately by the scientific community.

Supporting Information

Table S1.

Surveillance Variable Abstraction Results.

https://doi.org/10.1371/journal.pone.0109583.s002

(DOCX)

Figure S1.

Google Trends Web Page Output. Screenshot of a Google Trends search output when queried for 3 terms: [“Google Trends”], [“Google Insights”], and [“Google Trends” + “Google Insights”]. We searched Worldwide, using all query categories, for the time period from January 2004 to March 2014 (site accessed: 3/17/14).

https://doi.org/10.1371/journal.pone.0109583.s003

(TIFF)

Figure S2.

Distribution of Articles Included in Our Review by Year of Publication. Notably, we did not include those articles published in 2014 (n = 5) in the figure, as they represent only part of that year.

https://doi.org/10.1371/journal.pone.0109583.s004

(TIFF)

Figure S3.

Forest Plot of Measures of Association for Surveillance Studies Using Pearson’s Correlation. Plot of correlation statistics from each surveillance study that used Pearson’s correlation. For studies with multiple correlation statistics, each was plotted individually.

https://doi.org/10.1371/journal.pone.0109583.s005

(TIFF)

Acknowledgments

We would like to thank Dr. Harlan M. Krumholz for his guidance with the manuscript.

Author Contributions

Conceived and designed the experiments: SVN KM IR BW. Performed the experiments: SVN KM IR BW SW RPD SIC. Analyzed the data: SVN KM IR BW. Contributed reagents/materials/analysis tools: SVN KM IR BW SW RPD SIC. Contributed to the writing of the manuscript: SVN KM IR BW SW RPD SIC.

References

  1. Brownstein JS, Freifeld CC, Madoff LC (2009) Digital disease detection – harnessing the Web for public health surveillance. N Engl J Med 360: 2153–2155, 2157.
  2. Barrett-Connor E, Ayanian JZ, Brown ER, Coultas DB, Francis CK, et al. (2011) A Nationwide Framework for Surveillance of Cardiovascular and Chronic Lung Diseases.
  3. Ginsberg J, Mohebbi MH, Patel RS, Brammer L, Smolinski MS, et al. (2009) Detecting influenza epidemics using search engine query data. Nature 457: 1012–1014.
  4. Google (2014) Google Trends. Available: http://www.google.com/trends/. Accessed 2014 April 25.
  5. Google (2014) Google Trends Help. Available: https://support.google.com/trends/. Accessed 2014 April 25.
  6. Mondria J, Wu T (2013) Imperfect financial integration and asymmetric information: competing explanations of the home bias puzzle? Canadian Journal of Economics/Revue canadienne d'économique 46: 310–337.
  7. Choi HY, Varian H (2012) Predicting the Present with Google Trends. Economic Record 88: 2–9.
  8. Google (2014) How to type your search term. Available: https://support.google.com/trends/answer/4359582?hl=en&ref_topic=4365599. Accessed 2014 April 25.
  9. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group (2009) Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 6: e1000097.
  10. Stroup DF, Berlin JA, Morton SC, Olkin I, Williamson GD, et al. (2000) Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. JAMA 283: 2008–2012.
  11. Viswanathan M, Ansari MT, Berkman ND, Chang S, Hartling L, et al. (2008) Assessing the Risk of Bias of Individual Studies in Systematic Reviews of Health Care Interventions. Methods Guide for Effectiveness and Comparative Effectiveness Reviews. Rockville (MD).
  12. (2011) Citation averages, 2000–2010, by fields and years. Times Higher Education. Available: http://www.timeshighereducation.co.uk/415643.article. Accessed 2014 March 15.
  13. Sackett DL (1979) Bias in analytic research. J Chronic Dis 32: 51–63.
  14. Kicinski M (2013) Publication bias in recent meta-analyses. PLoS One 8: e81823.
  15. Kerr NL (1998) HARKing: hypothesizing after the results are known. Pers Soc Psychol Rev 2: 196–217.
  16. Popper KR (1959) The Logic of Scientific Discovery. New York: Basic Books. 479 p.
  17. Asendorpf JB, Conner M, De Fruyt F, De Houwer J, Denissen JJA, et al. (2013) Recommendations for increasing replicability in psychology. European Journal of Personality 27: 108–119.
  18. Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2: e124.
  19. Leonhardt D (2006) How to Use Google Trends. New York Times. Available: http://www.nytimes.com/2006/07/05/business/05leonhardt-aboutgoogtrends.html?_r=0. Accessed 2014 April 1.
  20. Murugiah K, Ranasinghe I, Nuti SV (2014) Geographic obesity burden and Internet searches for bariatric surgery: importance of a combined search strategy. Surg Obes Relat Dis 10: 369–370.
  21. Yang AC, Huang NE, Peng CK, Tsai SJ (2010) Do Seasons Have an Influence on the Incidence of Depression? The Use of an Internet Search Engine Query Data as a Proxy of Human Affect. PLoS One 5.
  22. Sueki H (2011) Does the volume of Internet searches using suicide-related search terms influence the suicide death rate: Data from 2004 to 2009 in Japan. Psychiatry and Clinical Neurosciences 65: 392–394.
  23. Lazer D, Kennedy R, King G, Vespignani A (2014) The Parable of Google Flu: Traps in Big Data Analysis. Science 343: 1203–1205.
  24. Metcalfe D, Price C, Powell J (2011) Media coverage and public reaction to a celebrity cancer diagnosis. Journal of Public Health 33: 80–85.
  25. Huang J, Zheng R, Emery S (2013) Assessing the impact of the national smoking ban in indoor public places in China: evidence from quit smoking related online searches. PLoS One 8: e65577.
  26. Ayers JW, Althouse BM, Allem JP, Ford DE, Ribisl KM, et al. (2012) A Novel Evaluation of World No Tobacco Day in Latin America. Journal of Medical Internet Research 14: 288–298.
  27. Ayers JW, Althouse BM, Noar SM, Cohen JE (2014) Do celebrity cancer diagnoses promote primary cancer prevention? Prev Med 58: 81–84.
  28. Ayers JW, Ribisl KM, Brownstein JS (2011) Tracking the Rise in Popularity of Electronic Nicotine Delivery Systems (Electronic Cigarettes) Using Search Query Surveillance. American Journal of Preventive Medicine 40: 448–453.
  29. Ayers JW, Ribisl K, Brownstein JS (2011) Using Search Query Surveillance to Monitor Tax Avoidance and Smoking Cessation following the United States’ 2009 “SCHIP” Cigarette Tax Increase. PLoS One 6.
  30. Kostkova P, Fowler D, Wiseman S, Weinberg JR (2013) Major Infection Events Over 5 Years: How Is Media Coverage Influencing Online Information Needs of Health Care Professionals and the Public? Journal of Medical Internet Research 15: 167–190.
  31. Glynn RW, Kelly JC, Coffey N, Sweeney KJ, Kerin MJ (2011) The effect of breast cancer awareness month on internet search activity - a comparison with awareness campaigns for lung and prostate cancer. BMC Cancer 11.
  32. McDonnell WM, Nelson DS, Schunk JE (2012) Should we fear “flu fear” itself? Effects of H1N1 influenza fear on ED use. American Journal of Emergency Medicine 30: 275–282.
  33. Ayers JW, Althouse BM, Ribisl KM, Emery S (2013) Digital Detection for Tobacco Control: Online Reactions to the United States’ 2009 Cigarette Excise Tax Increase. Nicotine Tob Res.
  34. Reis BY, Brownstein JS (2010) Measuring the impact of health policies using Internet search patterns: the case of abortion. BMC Public Health 10.
  35. Fenichel EP, Kuminoff NV, Chowell G (2013) Skip the Trip: Air Travelers’ Behavioral Responses to Pandemic Influenza. PLoS One 8.
  36. Stein JD, Childers DM, Nan B, Mian SI (2013) Gauging interest of the general public in laser-assisted in situ keratomileusis eye surgery. Cornea 32: 1015–1018.
  37. Ayers JW, Althouse BM, Allem JP, Rosenquist JN, Ford DE (2013) Seasonality in seeking mental health information on Google. Am J Prev Med 44: 520–525.
  38. Murugiah K, Rajput K (2010) Cardiopulmonary resuscitation (CPR) survival rates and Internet search for CPR: Is there a relation? Resuscitation 81: 1733–1734.
  39. Carr LJ, Dunsiger SI (2012) Search Query Data to Monitor Interest in Behavior Change: Application for Public Health. PLoS One 7.
  40. Connolly MP, Postma M, Silber SJ (2009) What’s on the mind of IVF consumers? Reproductive Biomedicine Online 19: 767–769.
  41. Davis NF, Breslin N, Creagh T (2013) Using Google Trends to Assess Global Interest in ‘Dysport®’ for the Treatment of Overactive Bladder. Urology 82: 1189.
  42. Bentley RA, Ormerod P (2010) A rapid method for assessing social versus independent interest in health issues: A case study of ‘bird flu’ and ‘swine flu’. Social Science & Medicine 71: 482–485.
  43. Liu R, Garcia PS, Fleisher LA (2012) Interest in Anesthesia as Reflected by Keyword Searches using Common Search Engines. J Anesth Clin Res 3.
  44. Hill S, Mao J, Ungar L, Hennessy S, Leonard CE, et al. (2011) Natural Supplements for H1N1 Influenza: Retrospective Observational Infodemiology Study of Information and Search Activity on the Internet. Journal of Medical Internet Research 13.
  45. Markey PM, Markey CN (2013) Seasonal variation in internet keyword searches: a proxy assessment of sex mating behaviors. Arch Sex Behav 42: 515–521.
  46. Schuster NM, Rogers MAM, McMahon LF (2010) Using Search Engine Query Data to Track Pharmaceutical Utilization: A Study of Statins. American Journal of Managed Care 16: E215–E219.
  47. Davis NF, Smyth LG, Flood HD (2012) Detecting internet activity for erectile dysfunction using search engine query data in the Republic of Ireland. BJU International 110: E939–E942.
  48. Harsha AK, Schmitt JE, Stavropoulos SW (2014) Know Your Market: Use of Online Query Tools to Quantify Trends in Patient Information-seeking Behavior for Varicose Vein Treatment. J Vasc Interv Radiol 25: 53–57.
  49. Breyer BN, Eisenberg ML (2010) Use of Google in study of noninfectious medical conditions. Epidemiology 21: 584–585.
  50. Leffler CT, Davenport B, Chan D (2010) Frequency and seasonal variation of ophthalmology-related internet searches. Canadian Journal of Ophthalmology/Journal canadien d'ophtalmologie 45: 274–279.
  51. Ingram DG, Plante DT (2013) Seasonal trends in restless legs symptomatology: evidence from Internet search query data. Sleep Medicine 14: 1364–1368.
  52. Brigo F, Igwe SC, Ausserer H, Nardone R, Tezzon F, et al. (2014) Why do people Google epilepsy?: An infodemiological study of online behavior for epilepsy-related search terms. Epilepsy Behav 31C: 67–70.
  53. Bragazzi NL (2013) Infodemiology and infoveillance of multiple sclerosis in Italy. Mult Scler Int 2013: 924029.
  54. Braun T, Harreus U (2013) Medical nowcasting using Google Trends: application in otolaryngology. European Archives of Oto-Rhino-Laryngology 270: 2157–2160.
  55. Walcott BP, Nahed BV, Kahle KT, Redjal N, Coumans JV (2011) Determination of geographic variance in stroke prevalence using Internet search engine analytics. Neurosurgical Focus 30.
  56. Willard SD, Nguyen MM (2013) Internet Search Trends Analysis Tools Can Provide Real-time Data on Kidney Stone Disease in the United States. Urology 81: 37–42.
  57. Johnson AK, Mehta SD (2014) A comparison of Internet search trends and sexually transmitted infection rates using Google trends. Sex Transm Dis 41: 61–63.
  58. Polkowska A, Harjunpaa A, Toikkanen S, Lappalainen M, Vuento R, et al. (2012) Increased incidence of Mycoplasma pneumoniae infection in Finland, 2010–2011. Eurosurveillance 17: 8–11.
  59. Seifter A, Schwarzwalder A, Geis K, Aucott J (2010) The utility of “Google Trends” for epidemiological research: Lyme disease as an example. Geospatial Health 4: 135–137.
  60. Rossignol L, Pelat C, Lambert B, Flahault A, Chartier-Kastler E, et al. (2013) A Method to Assess Seasonality of Urinary Tract Infections Based on Medication Sales and Google Trends. PLoS One 8.
  61. Mytton OT, Rutter PD, Donaldson LJ (2012) Influenza A(H1N1)pdm09 in England, 2009 to 2011: a greater burden of severe illness in the year after the pandemic than in the pandemic year. Eurosurveillance 17: 11–19.
  62. Jena AB, Karaca-Mandic P, Weaver L, Seabury SA (2013) Predicting New Diagnoses of HIV Infection Using Internet Search Engine Data. Clinical Infectious Diseases 56: 1352–1353.
  63. Zheluk A, Quinn C, Hercz D, Gillespie JA (2013) Internet Search Patterns of Human Immunodeficiency Virus and the Digital Divide in the Russian Federation: Infoveillance Study (vol 15, pg e256, 2013). Journal of Medical Internet Research 15.
  64. Althouse BM, Ng YY, Cummings DAT (2011) Prediction of Dengue Incidence Using Search Query Surveillance. PLoS Neglected Tropical Diseases 5.
  65. Carneiro HA, Mylonakis E (2009) Google Trends: A Web-Based Tool for Real-Time Surveillance of Disease Outbreaks. Clinical Infectious Diseases 49: 1557–1564.
  66. Samaras L, Garcia-Barriocanal E, Sicilia MA (2012) Syndromic surveillance models using Web data: The case of scarlet fever in the UK. Informatics for Health & Social Care 37: 106–124.
  67. Kang M, Zhong HJ, He JF, Rutherford S, Yang F (2013) Using Google Trends for Influenza Surveillance in South China. PLoS One 8.
  68. Pelat C, Turbelin C, Bar-Hen A, Flahault A, Valleron AJ (2009) More Diseases Tracked by Using Google Trends. Emerging Infectious Diseases 15: 1327–1328.
  69. Desai R, Hall AJ, Lopman BA, Shimshoni Y, Rennick M, et al. (2012) Norovirus Disease Surveillance Using Google Internet Query Share Data. Clinical Infectious Diseases 55: E75–E78.
  70. Desai R, Lopman BA, Shimshoni Y, Harris JP, Patel MM, et al. (2012) Use of Internet Search Data to Monitor Impact of Rotavirus Vaccination in the United States. Clinical Infectious Diseases 54: Cp8–Cp11.
  71. Cho S, Sohn CH, Jo MW, Shin SY, Lee JH, et al. (2013) Correlation between National Influenza Surveillance Data and Google Trends in South Korea. PLoS One 8.
  72. Dukic VM, David MZ, Lauderdale DS (2011) Internet Queries and Methicillin-Resistant Staphylococcus aureus Surveillance. Emerging Infectious Diseases 17: 1068–1070.
  73. Valdivia A, Monge-Corella S (2010) Diseases Tracked by Using Google Trends, Spain. Emerging Infectious Diseases 16: 168.
  74. Zhou XC, Ye JP, Feng YJ (2011) Tuberculosis Surveillance by Analyzing Google Trends. IEEE Transactions on Biomedical Engineering 58.
  75. Zhou X, Li Q, Zhu Z, Zhao H, Tang H, et al. (2013) Monitoring epidemic alert levels by analyzing Internet search volume. IEEE Trans Biomed Eng 60: 446–452.
  76. Forsyth AJM (2012) Virtually a drug scare: Mephedrone and the impact of the Internet on drug news transmission. International Journal of Drug Policy 23: 198–209.
  77. Ayers JW, Althouse BM, Allem JP, Childers MA, Zafar W, et al. (2012) Novel surveillance of psychological distress during the great recession. J Affect Disord 142: 323–330.
  78. Tefft N (2011) Insights on unemployment, unemployment insurance, and mental health. Journal of Health Economics 30: 258–264.
  79. Frijters P, Johnston DW, Lordan G, Shields MA (2013) Exploring the relationship between macroeconomic conditions and problem drinking as captured by Google searches in the US. Social Science & Medicine 84: 61–68.
  80. Bright SJ, Bishop B, Kane R, Marsh A, Barratt MJ (2013) Kronic hysteria: Exploring the intersection between Australian synthetic cannabis legislation, the media, and drug-related harm. International Journal of Drug Policy 24: 231–237.
  81. Hagihara A, Miyazaki S, Abe T (2012) Internet suicide searches and the incidence of suicide in young people in Japan. European Archives of Psychiatry and Clinical Neuroscience 262: 39–46.
  82. Gallagher CT, Assi S, Stair JL, Fergus S, Corazza O, et al. (2012) 5,6-Methylenedioxy-2-aminoindane: from laboratory curiosity to ‘legal high’. Human Psychopharmacology: Clinical and Experimental 27: 106–112.
  83. Steppan M, Kraus L, Piontek D, Siciliano V (2013) Are cannabis prevalence estimates comparable across countries and regions? A cross-cultural validation using search engine query data. Int J Drug Policy 24: 23–29.
  84. Song TM, Song J, An JY, Hayman LL, Woo JM (2014) Psychological and social factors affecting Internet searches on suicide in Korea: a big data analysis of Google search trends. Yonsei Med J 55: 254–263.
  85. Yang AC, Tsai SJ, Huang NE, Peng CK (2011) Association of Internet search trends with suicide death in Taipei City, Taiwan, 2004–2009. Journal of Affective Disorders 132: 179–184.
  86. Gunn JF, Lester D (2013) Using Google searches on the internet to monitor suicidal behavior. Journal of Affective Disorders 148: 411–412.
  87. McCarthy MJ (2010) Internet monitoring of suicide risk in the population. Journal of Affective Disorders 122: 277–279.
  88. Bragazzi NL (2013) A Google Trends-based approach for monitoring NSSI. Psychol Res Behav Manag 7: 1–8.
  89. Page A, Chang SS, Gunnell D (2011) Surveillance of Australian suicidal behaviour using the Internet? Australian and New Zealand Journal of Psychiatry 45: 1020–1022.
  90. Yin S, Ho M (2012) Monitoring a toxicological outbreak using Internet search query data. Clinical Toxicology 50: 818–822.