Altmetrics Manifesto
The Altmetrics Manifesto, published in 2010 by Jason Priem and collaborators, marked a watershed moment in how scholars and institutions think about impact. It argued that the influence of scholarly work could no longer be measured only by long-established conventions like citation counts in journals. Instead, the manifesto urged leveraging a wider set of signals generated by the online attention economy—social media mentions, bookmarks, downloads, blog discussions, and other forms of public engagement—to capture how research actually travels, is discussed, and becomes usable beyond the academy. In short, it proposed a shift from a closed, gatekeeper-driven metric system to an open, interaction-driven one that could reveal real-world consequences of ideas more quickly and more broadly.
The manifesto did not claim to replace traditional bibliometrics. Rather, it framed a complementary approach: traditional citations measure scholarly influence within academia, while altmetrics measure reach, uptake, and resonance in the wider world. This dual framework could, in theory, provide a more complete picture of value—especially for research that informs policy, industry, or public life but does not attract a steady stream of formal citations. The authors also highlighted practical benefits: faster feedback about which topics are gaining attention, greater visibility for researchers outside well-connected networks, and the potential to reward useful work that changes practice rather than work that merely accumulates prestige in specialized journals.
The genesis of the Altmetrics Manifesto sits at the intersection of rapid digitalization and a desire for accountability in research funding and policy decisions. Proponents argued that online signals could democratize evaluation by surfacing practical impact that could be observed by practitioners, policymakers, and civil society, not just by editors and citation indices. They also emphasized openness: standardized data sources, interoperability among platforms, and the ability to trace the lineage of influence across disparate channels. Critics, however, warned that the new signals would be noisy, biased, and manipulable, inviting new forms of gaming and misinterpretation. The years since have seen a lively debate about whether these signals can or should be trusted as valid measures of quality or worth.
Background and core ideas
What altmetrics are: signals drawn from online platforms that indicate attention, discussion, or dissemination of scholarly work. Examples include social media mentions, bookmarks in reference managers, downloads, coverage in news media, and citations in non-traditional venues. These signals are collected and aggregated to form a portrait of a work’s reach and resonance beyond traditional academic channels.
The epistemic argument: impact is not just about being cited by other scholars; it is about how work changes minds, informs decisions, or feeds into application. By tracking diverse channels of engagement, researchers and funders can see whether knowledge is translating into practice or public discourse. This broader lens can be especially useful for applied fields, industry partnerships, and public policy.
The methodological claim: altmetrics are a complement, not a substitute, for traditional metrics. Proper use requires field-specific baselines, transparency about data sources, and an acknowledgment of biases and gaps in coverage. The manifesto urged experimentation with multiple signals rather than a single, superficial headline number.
The data ecosystem: the idea depended on the availability of open or accessible data streams, the ability to normalize across platforms, and the willingness of institutions to adopt these signals into evaluation workflows. This required collaboration among researchers, funders, librarians, and platforms. A minimal sketch of cross-platform normalization follows this list.
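To make the aggregation and normalization ideas concrete, the Python sketch below builds a per-signal profile for a single work by dividing each raw count by a field baseline. Everything here is illustrative: the signal names, the baseline values, and the use of field medians as baselines are assumptions, not the schema of any real altmetrics provider.

```python
from typing import Dict, Optional

def normalized_profile(counts: Dict[str, float],
                       baselines: Dict[str, float]) -> Dict[str, Optional[float]]:
    """Divide each raw count by its field baseline, signal by signal.

    `baselines` holds hypothetical per-field reference values (for example,
    the median count for works in the same discipline and year).
    """
    profile: Dict[str, Optional[float]] = {}
    for source, count in counts.items():
        baseline = baselines.get(source, 0.0)
        # Missing or zero baseline: report None instead of a misleading ratio.
        profile[source] = count / baseline if baseline > 0 else None
    return profile

# Hypothetical raw attention counts for one work, keyed by signal source.
work = {"tweets": 120.0, "mendeley_saves": 45.0, "news_mentions": 6.0}
# Hypothetical field medians for comparable works.
field_medians = {"tweets": 40.0, "mendeley_saves": 30.0, "news_mentions": 1.5}

print(normalized_profile(work, field_medians))
# {'tweets': 3.0, 'mendeley_saves': 1.5, 'news_mentions': 4.0}
```

Keeping the signals separate, rather than collapsing them into one composite, mirrors the manifesto's warning against a single headline number.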
The manifesto and its core propositions
Diversifying evaluation: broaden the set of signals used to assess impact so that work with practical utility or public engagement is visible, even if it does not generate traditional journal citations. This aligns with a view that public usefulness matters as much as scholarly prestige.
Timeliness and relevance: online signals often emerge quickly, offering faster feedback loops than annual or biennial review cycles. This can help researchers adjust directions, allocate resources, and demonstrate value to stakeholders in near real time.
Complementarity and accountability: the manifesto framed altmetrics as a tool to augment, not supplant, peer-reviewed indicators. The goal is to improve accountability by revealing how work flows into practice, policy, and society at large.
Practitioner-friendly signals: the emphasis was not on abstract prestige metrics but on signals that matter to practitioners, industry partners, journalists, and policymakers who use knowledge to solve problems.
Reception and debates
Support and adoption
A range of researchers and institutions welcomed the manifesto as a pragmatic recognition that scholarship exists in a public ecosystem. Universities began experimenting with altmetrics dashboards to supplement tenure and promotion discussions, funders explored altmetrics to gauge program-level outcomes, and publishers considered how these signals could inform dissemination strategies. The underlying idea—that impact can be visible in many venues beyond journals—resonated with the push toward more open, outcome-oriented science.
Criticisms and concerns
Data quality and coverage: not all fields or regions generate online attention at the same rate. Some disciplines, languages, and communities are underrepresented in the online signals that altmetrics collect, leading to biased or incomplete pictures of impact. This risk is a central concern for anyone relying on these metrics for accountability.
Gaming and manipulation: like any metric tied to social platforms, altmetrics are vulnerable to manipulation, from coordinated campaigns to bots or paid promotion. Guardrails, transparency, and methodological safeguards are essential to prevent gaming from distorting evaluations; a simple screening heuristic is sketched after this list.
Noise versus signal: not every online mention or download reflects meaningful influence. Some attention may be frivolous, sensational, or detached from scholarly quality. Critics worry that a focus on visibility could skew research priorities toward topics with mass appeal rather than long-term value.
Privacy and consent: tracking engagement across platforms raises questions about user privacy, data rights, and the extent to which scholars consent to being measured by personal interactions on social media and other services.
Equity and representation: while altmetrics promise broader visibility, they can also reproduce existing inequities, privileging researchers with access to strong networks, language advantages, or connections to digital platforms. Proponents argue that these gaps can be mitigated with careful design, but the concern remains a live point of contention.
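As one illustration of the safeguards mentioned above, the sketch below flags a work whose latest daily mention count sits far above its own recent history, a pattern typical of coordinated bursts. This is a hypothetical heuristic, not a method from the manifesto or any altmetrics provider; the window length and z-score threshold are arbitrary assumptions.

```python
from statistics import mean, pstdev
from typing import List

def flag_suspicious_spike(daily_mentions: List[int],
                          window: int = 14,
                          z_threshold: float = 4.0) -> bool:
    """Flag the most recent day if it sits far above the trailing window."""
    if len(daily_mentions) <= window:
        return False  # not enough history to form a baseline
    history = daily_mentions[-(window + 1):-1]  # the trailing window
    today = daily_mentions[-1]
    mu, sigma = mean(history), pstdev(history)
    if sigma == 0:
        return today > mu + z_threshold  # flat history: any jump is unusual
    return (today - mu) / sigma > z_threshold

# Example: two quiet weeks, then a 300-mention burst on the final day.
series = [2, 3, 1, 0, 2, 4, 3, 2, 1, 2, 3, 2, 1, 2] + [300]
print(flag_suspicious_spike(series))  # True
```

A screen like this also flags legitimate virality, so in practice flagged works would go to human review rather than being penalized automatically.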
Right-of-center perspective: controversies and responses
From a pragmatic, market-minded perspective, the altmetrics project is appealing because it aligns measurement with real-world influence and user utility. Advocates argue that:
Market signals can drive better allocation of resources, rewarding work that demonstrably informs policy, industry, or public life. When funded research yields tangible benefits, altmetrics help reveal that value even if it does not accrue in elite journals.
Decentralization reduces reliance on a small cadre of gatekeepers. A system that looks beyond traditional journals can accelerate innovation, lower barriers to entry for researchers outside traditional hubs, and encourage cross-disciplinary collaboration.
Accountability through diversity of signals can complement traditional peer review. If used carefully, altmetric signals can provide early warnings about misalignments between claimed results and real-world uptake, prompting timely corrective actions.
Critics who frame altmetrics as a threat to standards sometimes label the signals as inherently suspect or as a proxy for popularity competition. Proponents respond that:
The solution is not to abandon rigor but to implement robust governance: field-normalized baselines, transparent methodologies, platform-level disclosures, and comparisons at multiple scales (individuals, labs, departments, and institutions). These safeguards can curb biases and gaming while preserving the benefits of broader visibility; a percentile-based reading of field normalization is sketched after this list.
Concerns about representation can be addressed through multi-signal dashboards, field-aware weighting, and attention to context. If done intelligently, altmetrics can actually enhance fairness by surfacing work that would otherwise fly under the radar of traditional metrics.
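One concrete reading of "field-normalized baselines" is to rank a work's signal within a cohort of comparable works from the same field and year, so that raw counts are never compared across fields. A minimal sketch, with entirely hypothetical cohort data:

```python
from bisect import bisect_right
from typing import Sequence

def field_percentile(value: float, cohort: Sequence[float]) -> float:
    """Percentile of `value` among works from the same field and year."""
    ranked = sorted(cohort)
    return 100.0 * bisect_right(ranked, value) / len(ranked)

# Hypothetical cohorts: tweet counts for same-year papers in two fields.
math_cohort = [0, 0, 1, 2, 3, 5, 8, 12, 20, 40]
medicine_cohort = [5, 10, 20, 35, 50, 80, 120, 200, 350, 600]

print(field_percentile(50, math_cohort))      # 100.0: top of its field
print(field_percentile(50, medicine_cohort))  # 50.0: merely the median
```

The same raw count lands at very different percentiles in the two cohorts, which is exactly the distortion that field normalization is meant to remove.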
Whether critics are right to limit or resist these signals, or whether their criticisms are overstated, remains a matter of ongoing debate. The central point of contention is whether public attention and engagement should systematically influence resource allocation and reputation in science, and if so, how to prevent distortions while preserving incentives for impact-driven research.
Implications for governance, funding, and practice
Research funding and career advancement: altmetrics can inform portfolio decisions by highlighting projects with practical uptake or broad public engagement, but should not be the sole criterion. Balanced assessment remains essential.
Institutional strategy: universities and research centers can use altmetrics to tell a story about societal value, collaborate with industry, and demonstrate accountability to taxpayers or funders. But they must guard against treating online metrics as the only truth about quality.
Platform responsibility: as the primary data sources, platforms play a crucial role in shaping what gets counted. Transparency about data collection, privacy considerations, and criteria for signal inclusion is necessary to maintain credibility.
Global and cross-disciplinary considerations: researchers in some regions or fields may find altmetrics more or less useful depending on digital infrastructure and cultural practices around dissemination. Policies should be sensitive to these realities to avoid reinforcing existing disparities.