Think Check Submit
Think Check Submit is a practical framework designed to help individuals assess the credibility of information before they share it or submit it to a journal, platform, or other public venue. In an age where stories can travel faster than evidence, the approach emphasizes personal responsibility and a reliance on established norms of sourcing, verification, and accountability. It sits at the intersection of information literacy and the culture of science communication and is intended to complement, not replace, traditional institutions like peer review and journalism.
Proponents argue that the framework strengthens trust in public discourse by encouraging diligence in evaluating authorship, sources, and evidence, while preserving the open channels through which ideas can be tested. By foregrounding clear questions about who authored information, what the source is, and whether claims are supported by credible data, it aims to reduce the spread of misinformation without shutting down debate. In this sense, it aligns with the broader goal of maintaining high standards in critical thinking and fact-checking within a free-access information environment.
The following sections outline how the framework is intended to function, what it asks people to consider, and how it has been received in practice. The discussion also considers criticisms and defenses, particularly those that arise in debates about information governance, platform moderation, and the balance between free inquiry and accountability.
Origins and purpose
Think Check Submit emerged as a collective effort among librarians, publishers, and other organizations in scholarly communications seeking to improve the reliability of information in circulation, on health topics and beyond. The project provides an accessible set of prompts and a checklist meant to guide readers through a short due-diligence process before sharing content or submitting it for publication. The aim is to foster a habit of verification that can be applied across disciplines, from clinical research to journalism to everyday social media use. The initiative's own materials supply the core prompts and guidance, and the approach sits alongside broader efforts in information governance and reproducibility.
In practice, the framework is oriented toward sources that carry claims with real-world consequences—policies, medical guidance, or scientific conclusions—where faulty sourcing can lead to harm or misinformed decision-making. The emphasis on transparent authorship, traceable evidence, and accessible sourcing resonates with how many professions expect information to be produced and evaluated, including academic publishing and regulatory science.
Core principles and the checklist
Think Check Submit typically presents a simple, actionable set of questions meant to be quickly applied by individual readers, students, researchers, or professionals. The core elements include:
Author and sponsor: Is it clear who created the information and who stands behind it? Is there disclosure of affiliations, funding, or potential conflicts of interest? See authorship and conflict of interest for related concepts.
Purpose and audience: What is the intended goal of the content? Is the purpose clearly stated (inform, persuade, entertain), and is the target audience appropriate for the claims being made?
Evidence and sources: Are claims supported by credible evidence? Are the sources primary, transparent, and traceable? Is the information consistent with what is found in peer-reviewed literature and other reputable outlets?
Credibility and quality: Is the information produced by recognized, credible institutions or individuals with relevant expertise? Are there indicators of quality control, such as citations, retractions, or corrections?
Recency and updates: When was the information published or last updated? Is there an explicit note about the currency of the data, especially for rapidly changing fields?
Accessibility and reproducibility: Can others access the underlying data or sources? Is there enough detail to reproduce or verify the claims?
Reuse and permissions: If used as a basis for new work, are there licensing and attribution requirements? Is the information suitable for the intended use, including redistribution or adaptation?
Cross-checking and independent confirmation: Are there independent sources that corroborate the claims? How does the information compare with the consensus of experts in the field?
These questions are designed to be portable across contexts. They are not a guarantee of truth, but a structured way to reduce the likelihood that erroneous, misleading, or out-of-context information gets amplified. See information literacy and scientific integrity for related discussions.
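As a purely illustrative aid, the checklist can also be sketched in code. The short Python example below is not part of the Think Check Submit materials; the keys, the paraphrased wording, and the treatment of missing answers are assumptions made for the sketch, which simply records which prompts remain unresolved for a given item.

    # Illustrative sketch only: the prompts paraphrase the checklist above,
    # and the structure is an assumption, not official Think Check Submit guidance.
    CHECKLIST = {
        "author_and_sponsor": "Is it clear who created the information and who stands behind it?",
        "purpose_and_audience": "Is the purpose clearly stated and the audience appropriate?",
        "evidence_and_sources": "Are claims supported by credible, traceable evidence?",
        "credibility_and_quality": "Is there relevant expertise and visible quality control?",
        "recency_and_updates": "Is the information current, with a clear publication or update date?",
        "accessibility_and_reproducibility": "Can the underlying data or sources be accessed and verified?",
        "reuse_and_permissions": "Are licensing and attribution requirements clear for the intended use?",
        "independent_confirmation": "Do independent sources corroborate the claims?",
    }

    def unresolved_prompts(answers):
        """Return the prompts not yet satisfied; missing keys count as unresolved."""
        return [q for key, q in CHECKLIST.items() if not answers.get(key, False)]

    # Example: clear authorship and sourcing, but no access to the underlying data yet.
    answers = {"author_and_sponsor": True, "evidence_and_sources": True,
               "accessibility_and_reproducibility": False}
    for prompt in unresolved_prompts(answers):
        print("Unresolved:", prompt)

The sketch deliberately avoids any scoring: an unanswered question is treated as an open item to investigate, not as a pass, which mirrors the framework's emphasis on prompting further checking rather than issuing verdicts.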
How it operates in practice
For readers: The framework offers a quick mental checklist to use when encountering potentially important or controversial claims, whether in news media, academic articles, or online forums. It encourages skepticism of sensational claims and a preference for sources with clear documentation.
For authors and publishers: Think Check Submit serves as a reminder to provide transparent sourcing, disclosures, and up-to-date information. It can be used as a reference alongside editorial standards and publication ethics to improve the quality of submissions.
For educators and institutions: The approach can be integrated into media literacy curricula, helping students develop skills to differentiate high-quality information from filler or propaganda. It also reinforces the idea that responsibility for accuracy rests with both creators and gatekeepers in the information ecosystem.
For platforms: While the framework centers on the reader and writer, its principles feed into broader discussions about content moderation and the reliability of shared material. The goal is not to suppress ideas but to encourage responsible dissemination.
Reach, reception, and debates
Supporters contend that a disciplined, nonpartisan habit of verification strengthens public reasoning and protects institutions of knowledge from reckless claims. They argue that the framework preserves pluralism and free inquiry by emphasizing verifiable evidence rather than ideological conformity. Proponents also point out that it can be applied to everyday content, not just formal publications, helping to raise the standard of information across platforms and communities.
Critics, however, warn that any standardization of credibility can become a tool for gatekeeping. In practice, disagreements over what counts as credible evidence, what constitutes a legitimate source, or which institutions deserve trust can reflect broader cultural and political tensions. Some contend that high-profile campaigns of verification can be co-opted to shield established powers from scrutiny or to marginalize dissenting perspectives. From this perspective, the concern is not about encouraging due diligence per se but about who defines the norms of credibility and how they are enforced.
From a traditionalist or market-minded vantage point, the emphasis on institutional guarantees, such as peer-reviewed publication, transparent funding disclosures, and reproducible data, fits with a long-standing belief in the value of stable, accountable institutions. Skeptics of heavier-handed gatekeeping counter that even the best-intentioned vetting does not guarantee accuracy and can slow innovation, especially when disciplines differ in their standards of evidence or when rapid decision-making is required.
Wider conversations in the information ecosystem have touched on whether the Think Check Submit approach adequately distinguishes between well-supported disagreement and outright misinformation. Proponents maintain that the framework is not a political cudgel but a practical tool that helps readers navigate a complex information landscape. They also emphasize that it does not replace professional oversight; rather, it complements it by guiding ordinary readers toward better reading habits and more careful submissions.
Controversies and defenses (from a stability- and accountability-focused perspective)
Gatekeeping risk: Some argue that such frameworks could be used to suppress controversial or unconventional ideas if those ideas lack conventional sponsorship or mainstream alignment. Defenders respond that the aim is not to censor debate, but to ensure claims are anchored in evidence, sources, and verifiable context.
Free inquiry versus control: Critics worried about overreach contend that verification standards may morph into de facto control over what counts as credible. The counterpoint is that accountability mechanisms are necessary to prevent the spread of harmful misinformation while preserving room for robust, well-argued dissent that is properly sourced.
Discipline-specific standards: Different fields have distinct norms for what counts as credible evidence. Critics say one-size-fits-all checklists ignore these nuances. Proponents argue that the core questions are adaptable and can be tailored to disciplinary contexts while preserving core expectations: provenance, evidence, and transparency.
Impact on platform dynamics: In the platform era, there is concern that verification tools could interact with moderation policies in unintended ways. Supporters emphasize that such tools should empower users to evaluate information themselves and should be used in tandem with, not as a substitute for, professional review and editorial judgment.
"Wider accountability" critique and response: Some opponents frame the framework as part of a broader movement to police speech. In response, advocates point to the non-punitive, educational aim of the questions, stressing voluntary use and the preservation of open discussion, while underscoring that accountability should apply to all information producers, not only those on the margins of public discourse.