Special RFP for COVID-19 Trust in Science Proposals

In 2020, a portion of the fund will support projects that employ data-driven approaches to advance our understanding of trust and mistrust in science in the context of the novel coronavirus pandemic. The request for proposals (now closed) is available at the bottom of this page.

Funded projects include:

Mapping Covid-19 Misinformation
Yochai Benkler (Berkman Klein Center for Internet and Society, Harvard Law School)

Countering COVID-19 Misinformation Via WhatsApp in Zimbabwe
Kevin Croke (Harvard T.H. Chan School of Public Health), Jeremy Bowles (Harvard Faculty of Arts and Sciences), Horacio Larreguy (Harvard Faculty of Arts and Sciences), John Marshall (Columbia University), Shelly Liu (UC Berkeley)

Misinformation about health is a serious problem in the COVID-19 crisis, particularly in developing countries where access to credible scientific news sources is limited and misinformation spreads virally via social media. We propose to experimentally evaluate two methods to address this problem in Zimbabwe, where fake news regarding COVID-19 has been identified as a major problem. This study builds on a previous collaboration with the Zimbabwean NGO Kubatana, which demonstrated a promising role for WhatsApp messages in correcting misconceptions about COVID-19 and encouraging preventive behavior: an experiment using WhatsApp in Zimbabwe increased knowledge about COVID-19 by 0.26 standard deviations and increased compliance with social distancing. WhatsApp chatbots have been described, including by the World Health Organization, as a promising tool to fight COVID-19 misinformation, but evidence on their effectiveness is limited, and scaling the dissemination of fact-checking is costly because it is labor-intensive. This project will innovate by using a WhatsApp chatbot, disseminated by the same trusted local NGO, to take fact-checking dissemination to scale. It will also test the effectiveness of a popular mass-media fact-checking tool distributed via WhatsApp.

Ensuring privacy in COVID-19 epidemiological mobility data sets
Salil Vadhan (Harvard John A. Paulson School of Engineering and Applied Sciences), Satchit Balsari (Harvard T.H. Chan School of Public Health), Caroline Buckee (Harvard T.H. Chan School of Public Health), Merce Crosas (Institute for Quantitative Social Science), Gary King (Institute for Quantitative Social Science)

This project is a collaboration between the COVID-19 Mobility Data Network, co-founded by Harvard faculty Caroline Buckee and Satchit Balsari, and the OpenDP initiative, led by faculty directors Salil Vadhan and Gary King. It aims to apply differential privacy to vast amounts of COVID-19 mobility data to study the movement of individuals during social distancing restrictions while preserving their privacy.
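To illustrate the kind of protection this project targets, here is a minimal sketch of the Laplace mechanism, the classic differential privacy primitive, applied to hypothetical aggregate mobility counts. The data, function names, and parameter values are invented for the example and are not drawn from the project itself:

```python
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale): the difference of two
    independent exponential draws with mean `scale` is Laplace."""
    rate = 1.0 / scale
    return random.expovariate(rate) - random.expovariate(rate)

def privatize_counts(counts: dict, epsilon: float, sensitivity: float = 1.0) -> dict:
    """Release epsilon-differentially-private versions of aggregate counts.

    Each person contributes at most `sensitivity` to any count, so adding
    Laplace noise with scale = sensitivity / epsilon masks any single
    individual's presence in the data.
    """
    scale = sensitivity / epsilon
    return {k: v + laplace_noise(scale) for k, v in counts.items()}

# Hypothetical aggregate: daily trips observed between two regions
trips = {"region_a->region_b": 120, "region_b->region_a": 97}
private_trips = privatize_counts(trips, epsilon=0.5)
```

Smaller values of epsilon add more noise and give stronger privacy; the tension the project must manage is choosing epsilon small enough to protect individuals while keeping the released mobility aggregates useful for epidemiological modeling.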

Explainable AI for Promoting Trust in Science
Marinka Zitnik (Harvard Medical School), Himabindu Lakkaraju (Harvard Business School)

The Coronavirus Disease 2019 (COVID-19) pandemic has severely strained health care systems and economies worldwide. The rapid spread and disruptive nature of this pandemic call for renewed public trust in science, because such trust is a critical factor in determining whether the general public will comply with the health recommendations outlined by the authorities. This compliance, in turn, is key to resolving the current crisis. To build, promote, and maintain public trust in science, we propose to develop novel computational frameworks that leverage explainable AI, an emerging area of artificial intelligence research that provides interpretable, easy-to-understand, yet highly accurate predictions. We will deploy our explainable AI toolset in several applications where trust in science is critical to curbing the spread of the virus and rapidly deploying new interventions. First, our AI tools will help doctors and healthcare professionals understand the behavior of complex machine learning models so they can decide if and when to trust those models; we believe this can significantly help medical professionals leverage computational research in making informed decisions about diagnosis and treatment. Second, our AI tools will build trust more broadly by identifying which studies and scientific articles carry the most credible findings on COVID-19, so that the general public has reliable information to draw on. In doing so, this project will provide a clear pathway to a more trustworthy scientific enterprise, promoting trust in science among the general public as well as among scientists themselves.
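As one illustration of what "explaining" a black-box model can mean, here is a minimal sketch of permutation feature importance, a common model-agnostic explainability technique (offered as a generic example, not as the authors' method): shuffle one feature at a time and measure how much the model's score drops. The toy model and data below are invented for the example.

```python
import random

def permutation_importance(model, X, y, metric, n_repeats=10, seed=0):
    """Score each feature by how much the model's metric degrades
    when that feature's column is randomly shuffled."""
    rng = random.Random(seed)
    baseline = metric(y, [model(row) for row in X])
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_repeats):
            col = [row[j] for row in X]
            rng.shuffle(col)
            X_perm = [row[:j] + [col[i]] + row[j + 1:] for i, row in enumerate(X)]
            drops.append(baseline - metric(y, [model(row) for row in X_perm]))
        importances.append(sum(drops) / n_repeats)
    return importances

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy "black box" that in fact depends only on feature 0
model = lambda row: 1 if row[0] > 0.5 else 0
X = [[random.random(), random.random()] for _ in range(200)]
y = [model(row) for row in X]

importances = permutation_importance(model, X, y, accuracy)
```

Here shuffling feature 0 hurts accuracy while shuffling feature 1 changes nothing, so the importance scores reveal which inputs the model actually relies on; that is the kind of insight that can help a clinician decide whether a model's reasoning is trustworthy.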

This application is now closed.

Application Deadline: Monday, May 25, 2020, 11:59 p.m.

Submission Requirements: All proposals should be submitted online here.

Contact for questions: Kevin Doyle at kevin_doyle@harvard.edu

The Harvard Data Science Initiative Research Fund for Trust in Science supports research that advances understanding of trust and mistrust in science by leveraging data science, with the goal of creating actionable insights that will advance trust in science. The Fund is a multi-donor gift fund with initial seed funding provided by Bayer, an HDSI Corporate Member.

In 2020, a portion of the fund will be used to support projects that employ data-driven approaches to advance our understanding of trust and mistrust in science in the context of the novel coronavirus pandemic. Applications are due Monday, May 25, 2020, and will undergo expedited review by faculty peer reviewers, with awards made in June.

Goal of the Special COVID-19 Trust in Science RFP

The goal of the Special COVID-19 Trust in Science RFP is to enable faculty across Harvard to study issues related to trust in science, broadly construed, in the context of the novel coronavirus pandemic. The HDSI welcomes data science-related proposals from all disciplines that seek to explore issues such as: the credibility of data and data models; the spread of information and misinformation; the communication of scientific findings in a pandemic; data-driven indicators of consensus around scientific findings and recommendations; uptake of recommendations in different communities and national contexts; and the impact of open data on public trust. These are examples only; applicants are encouraged to think broadly.

Matchmaking: The HDSI seeks to facilitate connections between researchers in diverse disciplines. If you believe your proposal would benefit from collaboration with a researcher outside your department or discipline, we encourage you to seek collaborators through the Trust in Science channel on the HDSI Slack. You can access our Slack site at https://bit.ly/2L5E3h5.

Award Amount and Duration: The HDSI expects to make 3-5 awards of up to $100,000 (direct costs) for projects designed to be completed within a 12- to 18-month project period.

Eligibility: Individuals who hold a faculty appointment at a Harvard school and who have principal investigator rights at that school are eligible to apply. (Harvard Medical School faculty must hold a faculty appointment with PI rights in one of HMS’s Quad-based, preclinical departments). 

Application Requirements: Please submit the following online here:

  1. Project Statement (2 page maximum) that addresses:

    1. The question or problem, and why it is important.

    2. The approach to be taken.

    3. The potential impact of the proposed work broadly and in the context of trust in science.

  2. Abridged CV or Biosketch (limit 2 pages)

  3. Budget Estimate (up to $100,000), with major divisions of funds listed as line items (personnel, equipment, etc.). School assessments and/or indirect costs should not be included in your budget (the HDSI will arrange these with home schools separately). Examples of eligible expenses include personnel (including postdocs, graduate students, and undergraduate students), travel, and acquisition of datasets. The following expenses are NOT eligible for funding: faculty salary, graduate student tuition, and subcontracts outside Harvard (unless there is a clear scientific rationale for why the work needs to be done externally).

Reporting: Each awardee is expected to complete a short interim report three months from the date funds are awarded, and a final report within 30 days of project completion.