Altmetrics

Date: July 2015
From: Library Technology Reports (Vol. 51, Issue 5)
Publisher: American Library Association
Document Type: Article
Length: 17,349 words


Abstract: 

Library Technology Reports (vol. 51, no. 5) introduces the concept of altmetrics, including its relation to existing citation-based research metrics and to the larger academic community. Major altmetrics tools are presented and discussed, as well as social media sources that comprise the spectrum of altmetrics, and methods for evaluating new and existing metrics tools. Drawing on recent research and online resources within the field, the report outlines both the promises and major obstacles faced by the field of altmetrics. The report also explicitly explores the role of libraries in altmetrics, such as the ability of librarians to serve as facilitators and communicators within their institutions, and to provide education and support related to altmetrics and scholarly impact. Various tips and resources are highlighted for librarians and administrators looking to stay current with changes in this rapidly moving field.

Contents

Chapter 1--Introduction to Altmetrics

Defining Altmetrics
Development of Altmetrics
From Bibliometrics to Altmetrics
Present-Day Altmetrics
Understanding Altmetrics
Recommended Readings
Notes

Chapter 2--Major Altmetrics Tools

Nonacademic Tools
Academic Tools and Peer Networks
Altmetrics Harvesting Tools
Evaluating Tools
Conclusion
Further Resources
Notes

Chapter 3--Issues, Controversies, and Opportunities for Altmetrics

Controversies Surrounding Altmetrics
Opportunities Surrounding Altmetrics
The Future of Altmetrics: Standards and Institutions
Further Reading
Notes

Chapter 4--Altmetrics and the Role of Librarians

Library Involvement
Ways to Stay Current
Conclusion
Further Reading and Resources
Notes

Chapter 1

Introduction to Altmetrics

In today's era of analytics, electronics, and scholarly competition, metrics are an important part of the everyday lives and workflows of people across the higher education community. From researchers applying for federal grants to faculty members preparing their tenure and promotion files, metrics have become an increasingly visible part of how academics and administrators are expected, if not required, to talk about impact and value. However, just as what it means to do research has changed drastically over the last fifteen years with advances in information technology, so too have the criteria for what constitutes a useful impact metric begun to evolve and expand with changes in scholarly communication. Of these expansions, the most significant is arguably the development of altmetrics, a distinctly twenty-first-century approach to impact measurement that relies heavily on the connection between scholarly activity and the opportunities afforded by the Social Web.

In this Library Technology Report, we introduce the most important features of the current altmetrics movement, from its origins in scholarly communication and citation-based bibliometrics to its recent flourishing in partnership with academic innovators and a growing population of academic librarians. Within each chapter, we highlight key players and issues that have arisen in combination with the altmetrics movement, including the uncertainties and opportunities that have alternatively stymied and encouraged its acceptance in certain higher education circles. By providing the facts surrounding the growth and development of altmetrics, particularly as they overlap with the concerns of academic libraries, we seek to provide today's library leaders with the necessary context to make decisions and take actions pertaining to the future of this quickly changing field of research and practice.

We begin this first chapter with a review of the recent origins of altmetrics, as well as a look at how the approach of altmetrics relates to more established practices for measuring scholarly impact, such as citation-based bibliometrics.

Defining Altmetrics

Altmetrics as a term was coined in September 2010 by Jason Priem, a doctoral student at UNC-Chapel Hill's School of Information and Library Science (see figure 1.1). (1) A firm believer in the power of online scholarly tools to help researchers filter information and identify relevant sources, Priem was interested in identifying a set of metrics that could describe relationships between the social aspects of the web and the spread of scholarship online. With few terms available to encompass this diverse-yet-specific group of analytics, Priem decided to popularize one of his own making. The result, altmetrics, is a shortened version of the phrase alternative metrics, presumably because it offered scholars an alternative to metrics derived from a purely print-based understanding of scholarly research and communication.

For practical purposes, the best-known definition of altmetrics comes from Altmetric.org, a website set up by Priem and three of his colleagues in October 2010 in order to promote their more detailed Altmetrics Manifesto (see figure 1.2). On it, the altmetrics approach is described as "the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship." (2) However, in the years following the release of this resource, new questions have arisen about exactly what this definition of altmetrics encompasses, and what it actually means to calculate altmetrics in different scholarly contexts. We will discuss these issues later, in the third chapter of this report.

In order to better understand the early history of altmetrics, we look now at a few of the more significant events leading up to its development, beginning with the changes in information technology and scholarly communication at work toward the end of the twentieth century.

Development of Altmetrics

As the definition of altmetrics makes clear, one of the first prerequisites for its development was the growth of the Social Web, or the part of the Internet focused on social relationships and activities.

Between the late 1990s and early 2000s, the texture of the Internet underwent a dramatic shift as innovative toolmakers began offering users more and more ways to create and share original, personal content on the web. Free online journaling platforms, such as LiveJournal (figure 1.3), led to an explosion in the number of blogs and bloggers, while early social networking sites such as MySpace and Friendster broadened the scope of online social sharing to include shorter updates, media, and more. By 2004, the year of the first Web 2.0 Conference, the Social Web had officially blossomed from a possible fad into a real and significant part of the Internet.

The technological changes of the late 1990s and early-to-mid 2000s were also important from the perspective of academia, although not entirely in the same ways. For instance, for the first time, researchers at colleges and universities were beginning to see the widespread availability of scholarship online. "Big Deals" made by librarians with certain scholarly publishers resulted in new electronic access to thousands of articles, often from journals previously outside of libraries' print collections. This sudden spike in the range and availability of electronic scholarly material quickly altered the ways that users searched for and found academic information. In response, most academic libraries continued to pursue bundled subscriptions to scholarly e-journals. However, at the start of the twenty-first century, mounting evidence began to suggest that such deals do little to solve the long-term problem of increasing costs for serials access.

In February 2002, in the midst of the serials crisis, the attendees of a small conference convened in Budapest by the Open Society Institute in December 2001 released a short public statement, in which they proposed using the Internet to make research literature free for anyone to use "for any ... lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself." (3) Later known as the Budapest Open Access Initiative, this powerful statement became a founding document of the open-access (OA) movement, for which many libraries and librarians have since become champions.

While the history of the open-access movement is too rich a topic to go into here, it is notable that its emergence helped set the stage for the later development of altmetrics. By emphasizing the power of the Internet as a tool for research, the benefits of rapid research discovery for purposes of innovation, and the positive consequences of openly sharing scholarly content with the public, OA helped encourage deeper connections between libraries, scholars, and makers of alternative platforms for scholarly publishing and networking. Evidence of this can be seen in the type of online scholarly venues that began to grow and thrive in the early 2000s following the articulation of open access, including the Public Library of Science (figure 1.4) and arXiv (figure 1.5), both of which endorse OA values while tracking interactions between objects and users online--that is, alternative impact metrics.

Perhaps it is for the combination of these various reasons that the mid-2000s saw the first true flourishing of both Web 2.0 and "open values" across the spheres of both academia and the general public. The year 2004, for instance, saw the release of Facebook, a social networking tool aimed originally at college students, which today sees 864 million daily active users. (4) In the same year, academic users of the Internet gained access to the citation-sharing tool CiteULike, which PhD candidate Richard Cameron developed based on the social bookmarking model popularized by the Web 2.0 tool Delicious. Gradually, this cross-pollination of social principles and "serious" user interests resulted in a flurry of game-changing tools for scholars and professionals alike, including Twitter (founded 2006), GitHub (founded 2007), and Academia.edu, Mendeley, and ResearchGate (each founded in 2008). In chapter 2, we will look more closely at each of these tools and more, as well as the ways in which they variously embrace the tracking of impact through metrics.

All this is to say that, by the time altmetrics was officially coined in 2010, many events had already taken place within both general society and academic culture to make the idea of a set of web-based metrics for measuring impact a tempting proposition--not just for scholars, but for publishers, toolmakers, and librarians, too. However, the "alternative" positioning of altmetrics, specifically in relation to citation-based bibliometrics, created an immediate set of obstacles for the movement, obstacles that the field of altmetrics has had to work hard to overcome ever since. For this reason, we take a moment here to briefly examine the relationship between bibliometrics and altmetrics, including how each has been received by proponents of the other over time.

From Bibliometrics to Altmetrics

In contrast to altmetrics, which has emerged as a fully articulated idea only within the last five years, bibliometrics has been around as a formal concept since the early 1960s and was originally defined as the set of quantitative methods used to analyze scholarly literature.

Best known for its inclusion of metrics such as Journal Impact Factor (see figure 1.6), which was proposed as early as 1955, bibliometrics is traditionally concerned with analyzing scholarship through the counting and tracking of journal article citations--which themselves tend to lean toward citations of other journal articles. Because of this, the major providers of bibliometrics tend to be closely connected to, or synonymous with, established indexers of scholarly articles, such as Thomson Reuters (Web of Science, Journal Citation Reports, Book Citation Index, Data Citation Index), Scopus (SCImago Labs [figure 1.7], Eigenfactor.org), and the increasingly popular Google Scholar (Google Scholar Profiles, Google Scholar Rankings).

These citation-based tools and metrics have come to dominate the scholarly impact landscape, particularly in the STEM fields, where article-based productivity metrics are more commonly accepted for purposes of evaluation and benchmarking. By the same token, however, for scholars in areas that emphasize the production of scholarly monographs over scholarly articles, the field of bibliometrics has garnered significantly less attention and clout. The same can be said for the use of bibliometrics among individual scholars whose research portfolios go beyond the bounds of traditional citation, such as those in the fine arts or academic departments with strong ties to professional practice.

While the analysis of print-based journal citations has always been the bread and butter of the bibliometrics world, this is not to say that the landscape of bibliometrics hasn't shifted noticeably with innovations in the technologies that drive scholarly communication. Even before the rise of altmetrics as a buzzword, bibliometricians and bibliometrics-producing organizations were clearly very interested in how to incorporate both the web and broader forms of scholarly output into their quantitative analyses; hence the occasional appearance of webometrics, cybermetrics, and other portmanteaus ending in -metrics in the pre-2010 literature.

Thus, although the field of altmetrics may have positioned itself originally as an "alternative" to the filtering systems offered up by print- and citation-based bibliometrics, its core interest remains largely congruent with that of bibliometrics in that both are essentially interested in what can be learned from the quantitative analysis of information related to scholarly output and publication. Such similarities have not, however, prevented occasional perceptible periods of tension between the two fields' respective followers. A number of bibliometrics proponents, for instance, have expressed public skepticism about altmetrics based on its seeming rejection of citation-based standards for tracking and identifying impactful scholarship. In the same vein, altmetrics advocates have occasionally made statements that could be interpreted as denigrating bibliometrics in general, rather than the specific monopoly of bibliometrics indicators like Impact Factor--a monopoly that had already generated substantial controversy within the larger academic community.

An example of this tension can be found in the recent online back-and-forth between Jeffrey Beall, author of a well-known blog that publishes the names of predatory open-access publishers, and the team behind the altmetrics product Impactstory, who often respond to criticism of altmetrics via their blog. Writing in a blog post published in August 2013, Beall calls the idea of altmetrics "ill-conceived" and expresses the opinion that article-level metrics "reflect a naive view of the scholarly publishing world"--that is, one that does not properly recognize efforts to game the system by unethical authors, publishers, and readers. (5) In response, former Impactstory team member Stacy Konkiel published a post on Impactstory's own blog in September 2014, in which she called Beall's comments "ill-informed" and refuted numerous assumptions about altmetrics taken from Beall's 2013 post. "There's no denying that 'gaming' happens, and it's not limited to altmetrics," she writes at one point, before launching into a more detailed explanation of how altmetrics providers deal with efforts at fraudulent activity. (6) Konkiel also refutes Beall's claim that, as a set of metrics that can be influenced by the public, altmetrics cannot be taken as a serious means of gauging article quality. "The point of altmetrics isn't to measure quality," she explains. "It's to better understand impact: both the quantity of impact and the diverse types of impact." (7)

We will return to this discussion of the controversies and criticisms that have surrounded altmetrics in chapter 3 of this report. However, it should be noted that flare-ups between altmetrics and bibliometrics have become noticeably rarer in the last year or two. This change, while not yet a sign of altmetrics' full acceptance in higher education, is certainly an indication of its transition from fringe topic into mainstream academic conversation.

Present-Day Altmetrics

Looking at the pace and progress of altmetrics in the present day, it becomes hard to imagine that the field won't have at least some place in the foreseeable future of scholarly research metrics. But is this acknowledgement the same as saying that the field of altmetrics has answered the questions necessary to deserve a stable spot in the long-term lineup of recommended practices for measuring scholarly impact? Anxiety among librarians and library administrators about how to present, contextualize, and, indeed, invest in altmetrics runs especially high, and relieving it requires up-to-date information.

On the one hand, as we will further discuss in chapter 2, altmetrics as a movement has certainly "grown up," to borrow a phrase from Martin Fenner, the Technical Lead for the Public Library of Science's (PLOS) Article-Level Metrics project and the recent editor of a special issue on altmetrics for Information Standards Quarterly (see figure 1.8). (8) The initial period of uncertainty over whether the collection of data surrounding web-based interactions with scholarly objects would be of serious value to any academic parties has given way to a new phase of practical curiosity, mostly in light of the interest expressed in altmetrics by researchers across the disciplines, as well as influential funding groups like NSF and NIH. Likewise, the producers of alternative metrics have significantly matured over the last two years, moving from a handful of one-man pet projects like ReaderMeter--an early altmetrics tool that considered impact solely from the perspective of Mendeley Readership metrics--to a lively marketplace of sleek systems and sophisticated user networks, most of which calculate their metrics using a variety of sources or methods. The decision by major publishers like Elsevier and EBSCO to acquire altmetrics-focused startups (Mendeley and Plum Analytics, respectively) is another tick mark in favor of altmetrics' eventual stability and wider acceptance as a supplement to bibliometrics.

On the other hand, even if the altmetrics movement is no longer in its infancy, one might be hard-pressed to place it beyond the phase of toddlerhood. After all, change continues to be rampant throughout the altmetrics community, and nowhere more so than in its business quarters. Major altmetrics harvesters may suddenly decide to rebrand themselves, as in the 2012 case of Impactstory (formerly Total-Impact). Experimental partnerships between altmetrics providers and publishers have also led to altmetrics unexpectedly cropping up in new online spaces overnight, such as the addition of metrics from Altmetric.com to some (but not all) Scopus articles in 2012, (9) and again to all online Wiley journals in 2014. (10)

Similarly, while the acquisition of altmetrics providers by for-profit publishing companies like Elsevier and EBSCO has buoyed the reputation of altmetrics for some parties, it has been a cause for concern for others, who see it as a sign that altmetrics may lose its connection to values of openness and online community. Thus, if altmetrics has grown up in the last two years, it has grown up via growth spurt--a pace that has come with a good deal of risk and that will necessitate a slowdown that still sits somewhere on the horizon. The efforts of groups like the National Information Standards Organization (NISO) to create new conversations around altmetrics standardization are part of this next stage of development, but participation by everyday users, researchers, administrators, and librarians is equally essential to success.

In summary: Between our present place and that horizon sits a good deal of opportunity, but also a great deal of work, which we will further discuss in chapter 4 of this report, along with the role of libraries, library liaisons, and library administrators in shaping the future of altmetrics.

Understanding Altmetrics

In this chapter, we introduced the concept of altmetrics, from its recent origins in scholarship and technology to its evolving position next to other quantitative fields like bibliometrics, up to the present day. In the next three chapters of this report, we will significantly elaborate on this portrait by detailing the major tools and provider types related to altmetrics (chapter 2); the issues, controversies, and opportunities that have arisen during the growth of altmetrics as a movement (chapter 3); and the various ways that academic libraries and librarians have become involved, or are positioned to become involved, in the next phase of the field's development (chapter 4).

Recommended Readings

Fenner, Martin, ed. "Altmetrics." Special issue, Information Standards Quarterly 25, no. 2 (Summer 2013). www.niso.org/publications/isq/2013/v25no2.

A well-scoped special issue of NISO's print and electronic magazine, Information Standards Quarterly, focused on recent developments in altmetrics as of Summer 2013. Articles include reflections on the consumption of article-level metrics (ALMs), the potential use of altmetrics by educational institutions, and other practical applications of altmetrics.

Piwowar, Heather, ed. "Altmetrics: What, Why, and Where?" Special issue, Bulletin of the American Society for Information Science and Technology 39, no. 4 (April/May 2013). https://www.asis.org/Bulletin/Apr-13/AprMay13_Piwowar.html.

A second altmetrics-focused special issue from 2013, this time from the ASIS&T online Bulletin. Edited by altmetrics leader Heather Piwowar, this issue contains several useful articles on altmetrics written by toolmakers in the field, as well as a valuable discussion of the overlap between altmetrics and institutional repositories.

Priem, Jason, Dario Taraborelli, Paul Groth, and Cameron Neylon. "Altmetrics: A Manifesto." Altmetrics.org. Last modified September 28, 2011. http://altmetrics.org/manifesto.

The original and still the most widely recognized statement about altmetrics available online. Links and tool references on other pages of the site are mostly out of date, as the four founders have since moved on to other, larger altmetrics projects.

Notes

(1.) Jason Priem (@jasonpriem), message to Twitter, September 28, 2010, https://twitter.com/jasonpriem/status/25844968813.

(2.) Jason Priem, Dario Taraborelli, Paul Groth, and Cameron Neylon, "Altmetrics: A Manifesto," Altmetrics.org, last modified September 28, 2011, http://altmetrics.org/manifesto.

(3.) "Read the Budapest Open Access Initiative," Budapest Open Access Initiative website, February 14, 2002, www.budapestopenaccessinitiative.org/read.

(4.) Based on data released for September 2014. See "Company Info," Facebook Newsroom, accessed December 19, 2014, http://newsroom.fb.com/company-info.

(5.) Jeffrey Beall, "Article-Level Metrics: An Ill-Conceived and Meretricious Idea," Scholarly Open Access (blog), August 1, 2013, http://scholarlyoa.com/2013/08/01/article-level-metrics.

(6.) Stacy Konkiel, "What Jeffrey Beall Gets Wrong about Altmetrics," Impactstory Blog, September 9, 2014, http://blog.impactstory.org/beall-altmetrics.

(7.) Ibid.

(8.) Martin Fenner, "Altmetrics Have Come of Age," Information Standards Quarterly 25, no. 2 (Summer 2013): 3, www.niso.org/apps/group_public/download.php/11270/Fenner_Editor_Letter_isqv25no2.pdf.

(9.) "Altmetric for Scopus," Elsevier Author's Update, last modified September 1, 2012, www.elsevier.com/ authors-update/story/impact-metrics/altmetric -for-scopus.

(10.) Graham Woodward, "Altmetric Is Now On Board for All Wiley Journals," Wiley Exchanges Blog, last modified July 8, 2014, http://exchanges.wiley.com/blog/2014/07/08/altmetric-is-now-on-board-for-all-wiley-journals/.

Chapter 2

Major Altmetrics Tools

The altmetrics landscape is shaped not only by thought leaders, outspoken critics, and promoters, but also by the very tools that are used to produce, aggregate, and contextualize the raw data that comprises altmetrics. In bibliometrics, the vast majority of available data is produced by a very small number of providers, mainly through costly library subscriptions. With altmetrics, however, usable data can be generated or harvested from a wide variety of sources, with different cost structures, accessibility levels, and intended audiences and purposes.

There are many reasons for this divergence between bibliometrics and altmetrics. One big reason is the very nature of the metrics themselves--since bibliometrics are based on journal articles, the big providers are concerned with indexing these articles, creating links between their citations, and using this data as the base for the calculated metrics. Since the field of altmetrics has no strictly set definition or set of defining metrics, an individual altmetric can be generated from a large variety of online tools, including social media websites, information-sharing sites, online scholarly networks, and other tools used to create, collect, share, organize, and manage many types of information. Some tools are specifically created for the purpose of altmetrics, while many take advantage of existing data generated for both scholarly and nonscholarly purposes. Likewise, some are freely available online, while others require a subscription or registration to access and are variously funded by grants, advertisements, companies, or the aforementioned subscriptions.

Given all of this diversity, it's not easy to keep track of all of the sources and tools that fall under the broad altmetrics umbrella. In this chapter, we will take a look at many of the tools that comprise this increasingly diverse landscape and discuss methods for evaluating new and existing tools as they continue to evolve.

Nonacademic Tools

We begin our tour by focusing on tools that define today's online user experience--websites, including social media tools, visited or used by, well, just about everyone. None of these sites was developed for the purpose of altmetrics or even with a particularly academic focus. Nonetheless, they can give us some insight into the impact of scholarship, particularly as it affects the public.

Facebook

Perhaps the best known of all social media tools, Facebook is used by individuals, groups, businesses, and other organizations to connect and share information of all kinds, including photos and videos. Sometimes, Facebook is even used to share academic information like journal articles, video presentations, and blog posts. The number of times a URL has been shared or Liked can be counted and reported by outside tools such as altmetrics harvesters, which we will discuss later in the chapter. These metrics can be used as an early indicator of interest or attention regarding any scholarly contribution that can be traced to a URL.

Twitter

Twitter serves a purpose very similar to Facebook's in that it connects individuals, businesses, and other entities for the purpose of sharing information, including photos and videos. However, Twitter's most distinguishing feature is that information bites, or Tweets, are restricted to 140 characters. Twitter also seems to be used more often for academic purposes, with people and organizations from publishers to individual journals to editors, researchers, and other academic individuals and entities widely represented. As on Facebook, when a URL is Tweeted or Retweeted, the number of Tweets can be counted, as well as the total reach of those Tweets--that is, the total number of Twitter users that follow everyone who has Tweeted the URL, meaning that they may have read the Tweet or clicked on the URL.
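
To make the reach arithmetic concrete, here is a minimal sketch of how a harvester might tally Tweet counts and total reach for a URL. It is illustrative only: the input format and function names are invented stand-ins for whatever a real Twitter search API would return, and the logic simply follows the definition above, adding each unique Tweeter's follower count once.

```python
# Illustrative sketch: tally the Tweet count and "reach" of a URL from a
# list of Tweets. The input format is a hypothetical stand-in for what a
# real harvester would pull from a Twitter search API.

def twitter_metrics(tweets):
    """tweets: list of dicts like {"user": "@handle", "followers": 1200}."""
    tweet_count = len(tweets)
    # Count each unique user's followers once, even if that user has
    # Tweeted or Retweeted the URL multiple times.
    followers_by_user = {t["user"]: t["followers"] for t in tweets}
    reach = sum(followers_by_user.values())
    return tweet_count, reach

# Example: three Tweets of one article URL by two unique users.
sample = [
    {"user": "@journal", "followers": 52000},
    {"user": "@researcher", "followers": 830},
    {"user": "@journal", "followers": 52000},  # a second Tweet by the journal
]
print(twitter_metrics(sample))  # (3, 52830)
```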

YouTube

YouTube is a popular video-sharing website where individuals and entities can create an account, allowing them to upload videos, subscribe to other users' video feeds, and comment on or Favorite a video. Many videos are also discovered through YouTube search, Google search, or the sharing of YouTube videos on social media sites and elsewhere. Metrics include the total number of views for a video, along with the number of comments and Favorites that a video has received. Videos can serve a variety of academic purposes, from a recorded lecture to a demonstration of methodology to a supplement to published research. The number of views or subscribers can demonstrate the relative interest in the videos or account. YouTube metrics are particularly useful for things like conference presentations, an area of scholarship that often lacks useful metrics.

Amazon

Amazon may not seem like an intuitive addition to the list. Amazon's main function is to buy and sell all kinds of goods, but it started in 1995 as an online bookstore of sorts before expanding into other types of goods. Amazon still enjoys heavy revenue from its print and e-book holdings, with over $5 billion earned from books alone in 2013. (1) Amazon provides a Best Sellers Rank for all books on its website, as shown in figure 2.1--that is, how often a book is purchased as compared to other books in the same category. This can demonstrate overall interest in the book, though there's no way to know who, exactly, might be buying the book (or for what purpose). Since Amazon users can also leave a rating and a review for any good, Amazon can also serve as a place to retrieve overall ratings and book reviews, keeping in mind that Amazon ratings and reviews can be added by any Amazon user for any reason and may reflect aspects of the buying process or impressions of the book rather than a reasoned critique of its contents.

Goodreads

Like Amazon, Goodreads can give us metrics only for a specific type of scholarship, that is, books. However, unlike Amazon, which gives us sales metrics, Goodreads can tell us self-reported readership metrics (see figure 2.2). Goodreads is a website and mobile app designed as a sort of "online bookshelf" where readers can keep track of books read, rate them, and look for book recommendations from other Goodreads readers. Another similarity to Amazon is the ability to retrieve the overall rating and book reviews from Goodreads members, keeping in mind again that the reviews may be coming from a diverse pool of readers.

SlideShare

As we move down the list, we're slowly branching away from "tools everyone uses" to "tools used more often by academics," and SlideShare is the first listed tool that counts academics among its primary, though not exclusive, users. On SlideShare, users can upload a "slidedeck," or series of slides, like those from PowerPoint or other similar programs. Other users can follow a user, receiving notifications when that person uploads new presentations. Slidedecks are searchable by keyword or by user-input tags. Metrics include total number of views, Favorites, comments, and downloads, and users can access detailed metrics for each slidedeck, including number of views over time, as shown in figure 2.3. As with other sources, metrics can hint at overall interest in a presentation but cannot differentiate between academic interest and interest from the general public.

GitHub

GitHub is a useful website for anyone who creates programming code because it allows individuals to upload code, collaborate on code with others, and freely share code with others. In turn, GitHub tracks watchers, collaborators, and "forks." A fork occurs when someone copies code to develop and use for their own purposes, similar to creating a derivative work from a Creative Commons-licensed work. For programmers, this represents one of the only ways to track the impact of written code, since citations are not easily trackable within code. However, since programming spans academic, business, and other realms, these metrics can show the impact of code only on other coders, and not necessarily within academia.
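
As a concrete illustration, these repository-level counts can also be read programmatically from GitHub's public REST API. The sketch below is minimal: the repository named in it is a placeholder, and the response field names reflect our reading of the API documentation.

```python
# A minimal sketch of retrieving watcher, star, and fork counts for a
# public repository from GitHub's REST API. The repository name is a
# placeholder; field names follow our reading of the API documentation.
import json
import urllib.request

def github_repo_metrics(owner, repo):
    url = f"https://api.github.com/repos/{owner}/{repo}"
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    return {
        "watchers": data["subscribers_count"],  # users watching the repo
        "stars": data["stargazers_count"],
        "forks": data["forks_count"],           # derivative copies
    }

# Example (hypothetical repository):
# print(github_repo_metrics("example-lab", "analysis-scripts"))
```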

Academic Tools and Peer Networks

The following are online tools used for organizing and sharing information, and each generates some type of metric that can be considered a type of altmetric. The main difference between these tools and those in the previous category is that these tools have been created for an academic audience, making academics the core user base for them. Because of this, the metrics generated from these tools can tell us more about the scholarly impact of contributions like journal articles. However, adoption of these tools throughout academia can vary widely, as their features may appeal to some disciplines more than others. These limitations should be kept in mind when using altmetrics information from these tools to portray the impact of a work, particularly when directly comparing works from different disciplines, an issue we will cover in greater detail in chapter 3.

Institutional Repositories

Institutional repositories (IRs) are familiar to many academic librarians since libraries are often responsible for the creation and maintenance of their institution's IR. But while many librarians are familiar with the role IRs play in contributing to open access, fewer are familiar with the role they play in the production of altmetrics. Many IRs provide metrics for the repository's artifacts, such as views and downloads. These metrics can also serve as a powerful incentive for researchers to place their artifacts in the repository. Stacy Konkiel, a former scholarly communications librarian, has written and presented extensively on the subject of IRs and altmetrics. (2)

CiteULike

CiteULike is a social bookmarking website specifically designed for researchers to save and organize journal citations into their personal libraries. These libraries can be made public or kept private. Metrics can then be generated based on the number of public CiteULike libraries that contain a particular article. Since private libraries can't be viewed and relatively little is known about the CiteULike user base, these metrics are best compared to those of other similar articles, though any metric can show a level of interest in the article.

CiteULike

www.citeulike.org

Mendeley

Like CiteULike, Mendeley is a free citation manager, helping researchers save and organize citations and PDFs. Users must register for an account online before downloading the Mendeley desktop program or using its online tools for citation management. However, Mendeley also hosts a social media component through its website by integrating the ability to follow individuals, join groups, and browse articles by discipline. The number of Mendeley users who have saved an article to their citation library is tracked, along with some demographic information about those users, as figure 2.4 demonstrates. These metrics are publicly available, meaning that they can be retrieved and analyzed by other tools. Having detailed demographics related to the metrics helps move the generated metrics from "someone is interested in this work" to "faculty and researchers in specific areas are interested in this work." Recent studies have shown a modest correlation between Mendeley users and later citation counts, meaning that this particular metric serves as a decent early indicator of scholarly impact, a point discussed in more detail in chapter 3.

Mendeley

www.mendeley.com
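
Because these readership metrics are publicly available, they can be fetched by scripts and harvesting tools as well as read on the website. The sketch below assumes access to Mendeley's REST API with an OAuth bearer token; the endpoint, the "stats" view parameter, and the response field names follow our reading of the API documentation and should be treated as assumptions to verify, and the token and DOI shown are placeholders.

```python
# A hedged sketch of looking up an article's readership in Mendeley's
# catalog by DOI. The endpoint, "stats" view, and field names follow our
# reading of Mendeley's API docs and may need verification; the token
# and DOI below are placeholders.
import json
import urllib.parse
import urllib.request

def mendeley_readership(doi, access_token):
    query = urllib.parse.urlencode({"doi": doi, "view": "stats"})
    request = urllib.request.Request(
        f"https://api.mendeley.com/catalog?{query}",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(request) as response:
        results = json.load(response)  # list of matching catalog records
    if not results:
        return None
    doc = results[0]
    # The by-academic-status breakdown supplies the demographic detail
    # (e.g., PhD students vs. professors) described above.
    return doc.get("reader_count"), doc.get("reader_count_by_academic_status")

# Example (placeholder values):
# print(mendeley_readership("10.1234/example.5678", "YOUR_OAUTH_TOKEN"))
```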

Academia.edu

Academia.edu is our first example of a "closed" peer network system. As on Mendeley, researchers can create a free profile and upload citations and full-text works, follow other authors, and track their usage metrics over time. However, unlike on Mendeley, this information is available only to the individuals who have registered for an account, meaning that it's closed to other tools, which cannot retrieve these metrics. Nonetheless, these metrics can show interest in works over time, and Academia.edu remains a very popular research network for many researchers across many disciplines.

Academia.edu

www.academia.edu

ResearchGate

ResearchGate is a closed peer network system designed for researchers in the sciences, with metrics accessible only to its users. After registering for a free account, ResearchGate users can upload their citations and full-text articles and get metrics for views, bookmarks, and downloads. Additionally, ResearchGate produces an author-level metric, the RG score, which aims to approximate the level of influence the user has within ResearchGate. The RG score is one of the only altmetrics scores whose primary focus is to measure author-level impact (albeit limited to impact within the ResearchGate system)--that is, a metric derived from the sum of a user's scholarly contributions, rather than from metrics for individual contributions (like journal articles) that are then summed for an individual author.

ResearchGate

www.researchgate.net

Social Science Research Network (SSRN)

The Social Science Research Network is one of the oldest peer networks, having been around in some form since 1994. SSRN is known primarily for allowing users to share pre-publication versions of articles, as well as white papers. As with the other peer networks detailed above, registration is free, and authors can add their own papers and retrieve metrics for those papers. Because it focuses on articles that have yet to be published, SSRN can be useful for gathering early metric indicators, such as views and downloads, prior to the publication of an article.

Social Science Research Network

www.ssrn.com

Altmetrics Harvesting Tools

This final category includes the tools most commonly associated with altmetrics because they are primarily concerned with harvesting, or gathering, altmetrics from many sources, including many of the sources detailed above. More importantly, these tools not only harvest altmetrics, but also work to contextualize the data in meaningful ways. This helps to provide a more in-depth understanding of what altmetrics can actually say about a scholarly work, particularly as it compares to similar works. Each tool has different features, strengths, and weaknesses, and they all serve similar but distinct purposes with different intended audiences.

Altmetric

The London-based company Altmetric provides a series of tools, all under the Altmetric banner, that increase in complexity from a tool designed to generate altmetrics for a single journal article to a tool that aggregates and compares altmetrics at the institutional level. Each tool, however, is built on altmetrics that are harvested and contextualized from the same sources, many of which are detailed above. Note that all metrics are derived from journal articles only--more specifically, journal articles with a retrievable DOI, PubMed ID, or arXiv ID with "friendly metadata." This essentially limits the content for which the Altmetric tools can pull data to only those journal articles that they can correctly identify.

Altmetric

www.altmetric.com

With these limitations in mind, Altmetric is still able to pull together some powerful altmetrics data, starting at the individual article level with its bookmarklet.

ALTMETRIC BOOKMARKLET

The Altmetric Bookmarklet is a bookmarklet that integrates with Chrome, Firefox, or Safari to provide altmetrics from a journal article's website. The bookmarklet web page walks through the steps to install and use the bookmarklet. Once it is launched, the signature "Altmetric donut" is displayed, along with the "Altmetric score," some basic altmetrics, and links to more information at the bottom, as shown in figure 2.5. The colors in the donut indicate the altmetrics source (Twitter, Facebook, Mendeley, etc.), and the Altmetric score in the middle shows the level of attention the article has received in one unified score as measured by the article's altmetrics interactions. (3) The higher the score, the greater the level of attention according to Altmetric's calculations. These numbers can, in theory, be directly compared between different journal articles.

Altmetric Bookmarklet

www.altmetric.com/bookmarklet.php

Clicking for more details allows the user to view the individual sources that make up the altmetrics displayed, as well as some key contextual information. The Score tab gives a more detailed analysis of the Altmetric score, along with ranked and percentile comparisons for the score (see figure 2.6).

Similar to the Score tab, the other tabs within the Altmetric bookmarklet break down the altmetrics data into finer detail, including individual Tweets, Facebook posts, and so on, that are included in the total for that source. This level of detail is an example of the high level of accessibility and openness prominent among altmetrics tools, a concept we'll return to in chapter 3.
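
The same per-article data that the bookmarklet displays can also be retrieved programmatically: Altmetric offers a free, rate-limited public API for basic lookups. The sketch below queries it by DOI; the DOI is a placeholder, and the response field names reflect our reading of the API documentation rather than a guaranteed interface.

```python
# A minimal sketch of retrieving an article's Altmetric data by DOI from
# Altmetric's free public API. The DOI is a placeholder, and the response
# field names should be confirmed against Altmetric's documentation.
import json
import urllib.request

def altmetric_by_doi(doi):
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# Example (placeholder DOI): print the Altmetric score and Tweet count.
# data = altmetric_by_doi("10.1234/example.5678")
# print(data.get("score"), data.get("cited_by_tweeters_count"))
```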

ALTMETRIC BOOKMARKLET INTEGRATIONS

While the bookmarklet works well as a stand-alone product for use by individuals on their Internet browsers, the same functionality has also been incorporated into an increasing number of other tools, providing seamless altmetrics data within those tools. Notable examples include Altmetric's integration within individual journal articles in Scopus, integration with institutional repositories such as DSpace, and integration with journal articles through specific publishers such as SAGE, HighWire, and Nature Publishing Group. These collaborations give increased exposure to Altmetric and, more generally, to altmetrics data, and we expect these types of collaborations to continue to grow in the future.

ALTMETRIC EXPLORER AND INSTITUTIONAL

Altmetric not only provides altmetrics data at the individual journal article level, but it also has two products, Explorer and Institutional, that provide summaries of this data at higher levels of evaluation--that is, they allow an individual to view altmetrics data for many journal articles, grouped by authors or by source (journal). While Explorer and Institutional have slightly different interfaces, due to the slight variations in audience, they both allow for more meaningful analysis and comparisons of the altmetrics. Furthermore, this data can be filtered and sorted in many different ways, allowing for a variety of analyses to take place.

Explorer is targeted toward publishers, librarians, and authors, while Institutional is (not surprisingly) targeted toward institutions and groups, but each provides a similar service. Explorer emphasizes use of the Altmetric donuts for individual article comparisons, while Institutional favors a less journal-centric and higher-order view (see figure 2.7).

Impactstory

Impactstory (formerly known as Total-Impact) was created to help researchers demonstrate research impact using altmetrics. Accordingly, Impactstory is designed for use by individual researchers (rather than departments or institutions), collating and contextualizing a researcher's scholarly outputs within that person's Impactstory profile page. This profile page can then be used in any situation in which a researcher needs to demonstrate impact, such as a grant application, a tenure or promotion file, or a review.

Impactstory

www.impactstory.org

Although Impactstory originally started with funding obtained through several grants, the company has recently decided to charge its users a modest fee ($45 a year, though fees may be waived based on financial need). However, new users can sign up for a seven-day trial to set up a profile and determine whether it's worth the cost for them.

Once researchers have created an account, they can add scholarly works manually or import works from SlideShare, ORCID, Scopus, and more. Works are sorted by type, and the user's home page displays an overview of all altmetrics, along with selected works highlighted in the center of the page, as shown in figure 2.8.

Impactstory will then display all available altmetrics for these works using badges like Discussed, Saved, and Viewed. Like other altmetrics harvesters, Impactstory excels in providing contextualized metrics based on raw altmetrics data it collects from other sites. If any metric is higher than 75 percent of comparable works, the badge will be designated as "Highly," such as "Highly Viewed." Badges can be clicked on for more detail about the comparison (see figure 2.9 for an example). As explained on the website, Impactstory will compare an article based on its primary reader group on Mendeley. (4) So if an article is read primarily by people affiliated with information science, all metrics will be compared to other information science articles published that same year.
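
The badge logic described above boils down to a percentile comparison against a pool of similar works. The following sketch is our illustrative reconstruction of that idea, not Impactstory's actual implementation, and the sample numbers are made up.

```python
# Illustrative reconstruction of percentile-based badging: a work earns a
# "Highly" badge for a metric when its value exceeds that of 75 percent
# of comparable works (same field, same publication year). This is not
# Impactstory's actual code.

def percentile_rank(value, peer_values):
    """Fraction of comparable works whose metric this value exceeds."""
    if not peer_values:
        return 0.0
    return sum(1 for v in peer_values if v < value) / len(peer_values)

def badge(metric_name, value, peer_values, threshold=0.75):
    if percentile_rank(value, peer_values) > threshold:
        return f"Highly {metric_name}"
    return metric_name

# Example: view counts for other information science articles published
# the same year (made-up numbers).
peers = [12, 40, 55, 80, 150, 200, 320, 500]
print(badge("Viewed", 410, peers))  # "Highly Viewed" (beats 7 of 8 peers)
```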

PlumX

PlumX was created by two entrepreneurs to help researchers and institutions meaningfully measure and engage with generated altmetrics data, and it serves as a direct competitor to Altmetric Institutional. Within PlumX, altmetrics are gathered from a variety of sources, including EBSCO abstract views and downloads (which are exclusive to PlumX, since the company, Plum Analytics, was acquired by EBSCO in January 2014). This data is gathered for all researchers and the scholarly works (or "artifacts," as PlumX calls them) that are entered for the researchers. The process of adding works is similar to Impactstory's, as researchers and artifacts can be added by DOI, URL, or PubMed ID or uploaded from other systems such as Web of Science or Scopus. Once researchers and their artifacts have been added, researchers can be organized into groups (e.g., departments within an institution or labs within a research facility). Altmetrics data can then be viewed at the institutional level, as demonstrated in figure 2.10, as well as at the group, author, or individual artifact levels.

PlumX

www.plu.mx

One of the more distinctive forms of engagement that PlumX provides is the Plum Print. This feature is designed to allow users to view types of engagement with altmetrics through a visual display--for example, degree of social media interaction versus citations. The larger the branch of the sunburst, the greater the number of altmetrics in that category, as shown in figure 2.11.

Kudos

Kudos is a relatively new online platform for researchers designed to help them better market their research and track their impact over time. Through Kudos users can associate their published articles with supplemental information and other files like videos, data files, or other articles in one Kudos article web page, as shown in figure 2.12. Users can then track how the sharing of these Kudos web pages affects metrics like views and downloads (see figure 2.13).

Kudos is free for users and is supported by publishers and institutions, which pay a fee for access to their own metrics. Kudos imports and displays metrics from a variety of sources, including data from Altmetric and Thomson Reuters (for Web of Science's times cited), along with tracking the number of views of the researcher's Kudos web pages.

Evaluating Tools

Since the field of altmetrics is still emerging, change and experimentation are currently the only norms upon which we can rely, making a fully up-to-date introduction to the tools that make up the altmetrics field virtually impossible. What doesn't change, however, is the set of core values and priorities that good tools bring to this evolving environment. With that in mind, it's important not only to be familiar with current tools, but also to be able to effectively evaluate new tools from an altmetrics perspective as they are added to the metrics tool landscape or evolve from their current iterations. Here are some factors to consider when assessing potential altmetrics tools.

Audience

Some tools are targeted toward the individual researcher, while others are designed for institutional use. Identifying the target audience will also help identify the intended uses, including the most likely scenarios in which this tool could be useful to your library or its users.

Cost

While the cost structure is usually relatively simple to determine, it is worthwhile to dig deeper and learn a bit more about the financial environment in which a tool operates. This can help identify tools that may later implement a subscription fee or that may be more likely to be bought by a larger company in the future.

Metrics and Accessibility

Understanding a bit about the metrics within the tool is important since metrics can tell different stories regarding research impact. For example, whether a tool is generating metrics for an abstract view versus a full-text article view versus a full-text article download can greatly change the understanding of the metric and what it says about the article itself.

Access to the metrics largely depends on whether the tool is an open tool or a closed tool--that is, whether registration and login are required to access personal metrics or whether metrics can be retrieved by anyone, including altmetrics harvesting tools. Accessibility can ultimately limit the success of the tool, particularly due to "sign-up fatigue," or the reluctance to register for and manage tool after tool. If metrics can be harvested and aggregated by one tool, it all but eliminates the need for management within the tool that creates the metrics.

Unique Features

Finally, learning more about what a tool can provide for its intended user can determine its relative usefulness for that user. In other words, as the business saying goes, have its makers "built a better mousetrap" that makes the tool more useful or appealing than existing options?

Conclusion

The altmetrics landscape comprises a diverse set of tools and resources that can be used to measure a variety of ways in which researchers and other people are viewing, saving, and interacting with scholarly content. But, like many twenty-first-century innovations, the tools themselves emerge, evolve, and disappear rapidly, making it difficult to stay on top of the most recent developments. Using evaluative criteria can help those working with altmetrics better understand the benefits and downsides of using data generated from any given source. However, understanding the central altmetrics tools is only part of the landscape equation. In the next chapter, we will take a look at some of the broader topics surrounding altmetrics, including barriers to broader acceptance of altmetrics, the impact of metrics on different scholarly disciplines, and future directions for altmetrics.

Further Resources

Barker, Kimberley R., and Andrea Horne Denton. "Altmetrics: The Movement, the Tools and the Implications." April 16, 2014. www.slideshare.net/CMHSL/altmetrics-2014415slideshare.

This presentation, from two health science librarians at the University of Virginia, does a nice job of summarizing the background of altmetrics and takes a look at many of the metrics and tools, with lots of pictures and descriptions. This presentation also serves as an excellent example of a librarian presentation, one of the many ways in which librarians can be involved with altmetrics, as we'll discuss in greater detail in chapter 4.

Chin Roemer, Robin, and Rachel Borchardt. "From Bibliometrics to Altmetrics: A Changing Scholarly Landscape." College and Research Libraries News 73, no. 10 (November 2012): 596-600. http://crln.acrl.org/content/73/10/596.full.

This article, written by the authors of this report, although now slightly outdated, gives a nice, succinct summary of the metrics and tools available within the fields of both altmetrics and bibliometrics at the time of its publication.

Fenner, Martin. "Altmetrics and Other Novel Measures for Measuring Scientific Impact." In Opening Science: The Evolving Guide on How the Web is Changing Research, Collaboration and Scholarly Publishing, edited by Sonke Bartling and Sascha Friesike. Springer, 2014. http:// book.openingscience.org/vision/altmetrics.html.

Fenner leads the Article-Level Metrics (ALMs) initiative at PLOS and writes frequently on the subject of altmetrics. This online book chapter does a great job of covering altmetrics sources, tools, and helpful terminology, and it also provides a summary of relevant research. The entire book, Opening Science, is open to comments and revisions, so the chapter is likely to change over time.

Notes

(1.) Jeff Bercovici, "Amazon vs. Book Publishers, by the Numbers," Forbes, February 10, 2014, www.forbes.com/sites/jeffbercovici/2014/02/10/amazon-vs-book-publishers-by-the-numbers.

(2.) Stacy's publications are accessible through her Google Scholar profile: http://scholar.google.com/citations?user=eslVzYQAAAAJ&hl=en&oi=ao. Stacy is now a Research Metrics Consultant for Altmetric, an altmetrics tool covered later in this chapter.

(3.) More information about the Altmetric score and how it is calculated is available on the website: https://www.altmetric.com/whatwedo.php.

(4.) "'Highly Cited' and Other Impact Badges," ImpactStory Feedback website, accessed March 12, 2015, http://feedback.impactstory.org/knowledgebase/ articles/400281--highly-cited-and-other-impact -badges.

Chapter 3

Issues, Controversies, and Opportunities for Altmetrics

For academic librarians attempting to assess the potential of the present-day altmetrics landscape, it is just as important to consider the larger discussions that have emerged surrounding the field of altmetrics as it is to evaluate the strengths and weaknesses of specific altmetrics tools.

As mentioned briefly in chapter 1, the general altmetrics movement has alternately suffered and benefited from a number of exaggerations that have circulated about its aims and goals. While misunderstandings are inevitable in any effort to change the way that academe approaches sensitive topics like impact, promotion, funding, and tenure, some of these comments have pointed toward genuine weaknesses that the altmetrics movement has struggled to address, or toward unique strengths on which it is attempting to capitalize.

In this chapter, we look at some of the most important issues to come out of the last five years of altmetrics discussion, including the controversies and opportunities that are most poised to affect its ultimate adoption, negatively or positively, across the wider expanse of higher education.

Controversies Surrounding Altmetrics

Gaming

Of all the criticisms that the altmetrics field has had to weather since its 2010 introduction, the most common by far is the suggestion that it is highly susceptible to "gaming" (see figure 3.1) and thus is a poor match for the rigorous standards of academic evaluation.

Gaming in this context refers to the practice of unscrupulously manipulating a system or set of data in order to produce results that fit a user's desired outcome. Because altmetrics are based explicitly on the collection of web-based data, which may include interactions between research and the general public, critics have accused altmetrics of lacking the security of citation-based approaches to calculating academic impact, which are inevitably more limited in scope and slower to accumulate in value.

To the credit of such critics, it's indisputably true that gaming does occur across the Social Web, from small disingenuous "Like" practices by well-meaning friends and family to the large-scale purchase of fake followers (figure 3.2), kudos, ratings, or other indicators of online social capital. One need only think back as far as December 2014, when Instagram instigated a massive purge of spam accounts and bots, resulting in the loss of millions of followers by a la mode celebrities like Justin Bieber (lost 3.5 million followers) and Kim Kardashian (lost 1.5 million followers). (1) Those in the business of social media have openly acknowledged how common the practice of purchasing fake followers is, particularly on sites like Twitter, where 1,000 new followers can be had for as little as a few dollars. (2)

Thus, from a general information perspective, there is always a definite risk in assuming the validity of information gleaned from social portions of the Internet, especially when user interactions can be translated to some form of real-world profit. However, the gaming of altmetrics is arguably a topic that requires a slightly more nuanced perspective on the credibility of online information. For instance, we might ask ourselves, are researchers really as likely as celebrities to manipulate metrics in order to promote themselves? What examples do we have of researchers doing this to date? And to the extent that these incidents do or can happen, what measures, if any, have altmetrics product developers taken to combat interference in their ultimate calculations?

As it turns out, attempts to game altmetrics--that is, to increase the perceived impact of research outputs or researchers via the Social Web--are both much less common and more difficult than many critics have assumed. In fact, most of what can be found today on the topic of gaming altmetrics comes directly from altmetrics advocates, who seem to discuss the issue regularly as part of explaining their respective approaches to gathering and measuring online activity (figure 3.3). For instance, Jennifer Lin of PLOS writes in a paper given in 2012 at the altmetrics12 ACM Web Science Workshop:


   In our [article-level metrics] advocacy efforts, we
   have learned that gaming is a widespread concern
   of researchers, institutional decision-makers,
   publishers, and funders. Indeed, one of the hallmark
   features of altmetrics is in fact the difficulty
   of gaming a system comprised of a multi-dimensional
   suite of metrics, setting it apart from the
   impact factor's vulnerabilities. (3)

In a 2013 company blog post, appropriately titled "Gaming Altmetrics," Euan Adie, founder of Altmetric, also situates the idea of gaming altmetrics in the context of general efforts by a small number of researchers to game academic metrics:


   Given that we know a small minority of researchers
   already resort to manipulating citations, it's
   not much of a leap to wonder whether or not an
   unscrupulous author might spend $100 to try and
   raise the profile of one of their papers without
   having to do any, you know, work. How much of
   this goes on? How can we spot it? What should our
   reaction be? (4)

The primary defense of altmetrics against accusations of gaming vulnerability therefore comes down to three main points. First, efforts to game the system of academic merit are already a part of the culture of higher education and include the same players who already try to inflate citation counts to boost their Impact Factors and other bibliometrics credentials. Second, the number of researchers who actually do this is relatively small--nowhere near what we see happening across Instagram and Twitter in general, when cultural capital is really on the line. And third, rather than ignore these warnings and possibilities, most altmetrics providers are taking pains to create safeguards within their already complex systems for assigning relative impact.

This third reason is precisely why altmetrics harvesters are very open about the data sources they include in their calculations and why they include them (e.g., highly auditable or scholarly information). It's also worth noting that in gathering so much data about researchers' online activity, altmetrics providers are good at identifying unusual patterns that suggest intentional or unintentional gaming. (5) This knowledge, combined with the availability of new technology to detect spam accounts, bots, and fake reviews, has reduced the gaming criticism of altmetrics from a major topic of discussion to a reasonably small acknowledgement of risk. (6) Arguably the greater concern for the future of altmetrics is the encouragement of scholarly activities that do not game the system--such as opening up honest conversations about the ways researchers can consciously-yet-scrupulously promote their work in online social spaces like Mendeley, SSRN, ResearchGate, science blogs, and, yes, public social networks like Twitter, too. (7)
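To make this kind of pattern detection concrete, the following minimal sketch (our own hypothetical illustration, not any provider's actual method) flags a paper whose daily mention count suddenly spikes far above its recent baseline:

    # A minimal, hypothetical sketch of spike detection on daily mention
    # counts; real providers presumably layer on far richer signals, such
    # as account age, duplicated post text, and network structure.
    def flag_spikes(daily_counts, window=7, threshold=5.0):
        """Return indices of days whose count exceeds `threshold` times
        the average of the preceding `window` days."""
        flagged = []
        for i in range(window, len(daily_counts)):
            baseline = sum(daily_counts[i - window:i]) / window
            if baseline > 0 and daily_counts[i] > threshold * baseline:
                flagged.append(i)
        return flagged

    # Example: a paper averaging ~2 mentions/day suddenly receives 40
    mentions = [2, 1, 3, 2, 2, 1, 2, 40, 3, 2]
    print(flag_spikes(mentions))  # [7]

Even this crude heuristic illustrates the underlying point: because harvesters collect time-stamped activity at scale, a purchased burst of attention tends to stand out rather than blend in.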

Correlation with Bibliometrics

Another area in which altmetrics has faced some controversy is in its correlation with bibliometrics, or more specifically, the lack thereof. As reviewed in chapter 1, bibliometrics and altmetrics share many of the same intentions in seeking to analyze scholarship quantitatively, although their definitions of scholarship and methods of analysis diverge significantly. Nevertheless, with altmetrics offering a much more immediate picture of scholarly impact than citation-based bibliometrics, researchers have naturally been curious about whether altmetrics can be used as a predictor of future citations, which are obviously desirable as a longer-term measure of relative scholarly success.

Several studies have been conducted to explore this question over the years, most of which have proved frustratingly inconclusive, contradictory, or unpromising. For instance, a 2013 study of articles from the medical and biological sciences conducted by Thelwall and his colleagues found that six out of eleven altmetrics (Tweets, Facebook wall posts, research highlights, blog mentions, mainstream media mentions, and forum posts) were associated with citation counts, but that "the methods used do not shed light on the magnitude of any correlation between the altmetrics and citations (i.e., the correlation effect size is unknown)." (8) By contrast, a 2014 study of 20,000 Web of Science articles, conducted by Zahedi, Costas, and Wouters and published in Scientometrics, was able to find moderate correlation between Mendeley readership metrics (figure 3.4) and citation indicators (r = 0.49) but also concluded that other altmetrics provided only "marginal information." (9) Moreover, the sparse coverage noted in many studies on this subject--large numbers of articles simply have no altmetrics data, often because they never appear in key altmetrics-generating networks or databases--has made attempts at finding correlation of any sort between altmetrics and bibliometrics feel largely premature. (10)
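For readers who want to see what such correlation figures actually measure, here is a minimal sketch using invented toy numbers (not data from any of the studies cited), which computes the rank correlation between Mendeley reader counts and citation counts for a small set of articles:

    # Toy illustration of an altmetrics-bibliometrics correlation test;
    # the numbers below are invented, not drawn from any cited study.
    from scipy.stats import spearmanr

    mendeley_readers = [12, 45, 3, 88, 20, 51, 7, 33]
    times_cited      = [4, 19, 1, 35, 9, 6, 2, 11]

    rho, p_value = spearmanr(mendeley_readers, times_cited)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

A coefficient near 1 would mean the two indicators rank articles almost identically; the moderate values reported in the literature (such as the r = 0.49 above) suggest the two track related but partly distinct signals.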

Possible limits and explanations aside, the fact that many altmetrics indicators do not seem to correlate with citation indicators has led to uncertainty among some researchers, who continue to feel pressure to provide citation-based evidence of impact to evaluators, yet who may not have sufficient time to let such impact manifest before facing an important deadline. The realization that altmetrics cannot precisely fill this gap may thus be interpreted by some as a failure on the part of the movement. However, the truth is almost certainly more complicated, rooted in the difference between how the altmetrics field understands scholarly impact and the understanding implied by the citation-based methods of bibliometrics. As Priem, Piwowar, and Hemminger suggested as early as 2012 in the conclusion to an article that examined 24,000 articles from PLOS, "Correlation and factor analysis suggest citation and altmetrics indicators track related but distinct impacts, with neither able to describe the complete picture of scholarly use alone." (11) Acceptance of this argument requires both scholars and evaluators to endorse a profound shift in the way that academia has looked for decades at scholarly impact metrics. It is a change that is coming, but coming so slowly that it puts at risk the near-term adoption of altmetrics in critical circles like higher administration, at least without further help.

Inclusion of Metrics from Public Social Media

The third major issue over which altmetrics has encountered significant challenges is its typical inclusion of metrics from nonscholarly social media tools, such as Twitter, Facebook, and YouTube (figure 3.5), in addition to metrics derived from more academically aimed peer networks like Mendeley, ResearchGate, and SSRN.

As stated in chapter 2, nonacademic social media statistics are currently used in altmetrics because of the potentially valuable connections they offer between research, researchers, and the general public. However, critics of their inclusion have pointed out a problem: although many young and media-savvy researchers are active on these networks, a large number of influential researchers are not--an absence that could have a detrimental effect on the altmetrics associated with their research, or with research in certain areas of expertise. This observation leads to a perhaps even more pointed criticism of metrics from non-academic-peer networks--that members of networks primarily populated by the general public are much less likely to be interested in esoteric fields of research than in research that connects to popular topics of discussion like climate change or weight loss.

A 2014 study published in the medical journal Circulation would seem on its face to add weight to this criticism. In it, researchers tracked the thirty-day page views of 243 Circulation articles while specifically attempting to promote the findings of about half the articles (randomized) via the journal's Facebook and Twitter accounts. The authors concluded that there was "no difference in median 30-day page views" between the articles that were specifically promoted via their social media strategy and the articles in the control group. (12) The Circulation study is particularly interesting, as it contradicts the results of previous studies that tracked the effects of promotion on the altmetrics of nonrandomized articles and found a positive relationship between the two, a fact noted by The Scholarly Kitchen blog contributor Phil Davis in a post about the study. (13) However, in the same post, Davis also astutely notes that "Cardiovascular researchers (and other bench and clinical researchers) are very different than computational biologists, social media researchers, and those who spend their days glued to their chairs and computers."

This observation--that public social media metrics are likely more relevant to fields with compatible communication habits, methods, or researcher demographics--is both a convincing retort to, and a valid critique of, the continued use of nonacademic metrics in altmetrics calculations and reports. Either way, it points to the need for better-refined altmetrics research. As Davis writes in another part of his post, "[The study's conclusion] questions whether prior studies were successful in isolating and measuring the effects of social media." (14) In the future, we are likely to see more intense discussion about the appropriate context for using public social media metrics alongside other altmetrics, as well as more sophisticated research into the effects of promotion on the metrics derived from non-scholarly-peer networks and into the changing demographics of social media users within the world of academia (see figure 3.6).

Opportunities Surrounding Altmetrics

Despite the degree of attention paid thus far to the criticisms and controversies around altmetrics, it's fair to say that much, if not most, of the buzz around the field for the last few years has been both positive and promising. Indeed, for academics, administrators, and funders in many areas, the field of altmetrics continues to present a significant and unique opportunity to fill gaps in scholarly impact that have long been in need of attention and that have disadvantaged scholarly outputs that do not fit the mold of citation-based impact. In this section, we look at three of the most notable opportunities presented by altmetrics and the progress of developers and users in making each one a reality.

Article-Level Impact

Arguably one of the most important opportunities opened up by altmetrics for researchers and, indeed, administrators is the uncoupling of the scholarly article from the constraints of the scholarly journal--at least in terms of impact (figure 3.7).

From a bibliometrics perspective, for instance, journal articles are almost always evaluated based on three factors: times cited (i.e., by other articles), journal Impact Factor, and qualitative reviews. However, because published articles typically take at least two years to start generating citation momentum and because fewer articles are reviewed in depth than are published each year by scholars, Impact Factor often becomes the primary substitute for article "quality" in evaluations--this despite the fact that Impact Factor makes no more claims to measure quality than do altmetrics. To base the determination of a specific article's quality, or even just its importance, mostly on a metric for the average number of citations generated by articles published by the same journal over the past two years is a questionable practice on many levels and has led to widespread criticism of the use of Impact Factor in researcher evaluations (figure 3.8).
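In case a concrete formula is helpful, the standard two-year Impact Factor of a journal for a given year y is computed as

    \[
    \mathrm{IF}_{y} \;=\;
    \frac{\text{citations received in year } y \text{ by items published in years } y-1 \text{ and } y-2}
         {\text{number of citable items published in years } y-1 \text{ and } y-2}
    \]

Note that an individual article's own citation count appears nowhere in this calculation, which is precisely the gap that article-level metrics set out to fill.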

Into this debate enter article-level metrics (ALMs), or the array of metrics collected around articles in order to show how interest in a specific article builds up over time. Although the concept of ALMs predated the birth of altmetrics by several years, ALMs are related to altmetrics in that they include data sources that go beyond traditional limits, such as usage statistics, comments, ratings, social media mentions, and appearances on notable scientific blogs. To use the explanation offered by the online primer on ALMs published by SPARC, "The attempt to incorporate new data sources to measure the impact of something, whether that something is an article or a journal or an individual scholar, is what defines altmetrics. ...ALMs are about the incorporation of altmetrics and traditional data points to define impact at the article level." (15)

With their attractive combination of metrics from the print and online worlds, ALMs have helped pioneer the idea that a research output's impact can and should be measured primarily by its own quantitative information and not that of the venue in which it appears. The success of this vision has been seen not only in the growth of ALM-friendly journals, like those published online by PLOS, but in the proliferation of ALM-generating archives, such as the Cornell-based arXiv.org (figure 3.9), that make accessible pre- or post-publication articles. By allowing researchers to gather feedback and get additional information about the use and distribution of their written work, these online repositories already have expanded researchers' options for understanding the near-term impact of their articles--all without having to rely on the crutch of venue-based citation averages. The result is a form of scholarly independence on which the field of altmetrics itself has capitalized by promoting metrics for works outside the journal article format that can still garner interactions similar to online articles.

(Multi-)Disciplinary Altmetrics

As mentioned in the section above, another opportunity for which altmetrics has been widely touted is its applicability to a wide variety of scholarly outputs, which makes it theoretically suitable for measuring impact across the disciplines in ways previously frustrated by bibliometrics.

As many tenure-track faculty can attest, impact is a tricky topic to pin down within a given field of study, let alone across multiple fields or disciplines. Consequently, attempts to define impact quantitatively have been unpopular with scholars in many nonquantitative fields, particularly in the arts and humanities, but also in some of the social sciences and theoretical sciences. Still, pressure on university campuses and from funding organizations to present "objective" data regarding researcher impact in addition to standard qualitative evidence has made it difficult for scholars undergoing evaluation to fully ignore the question of quantitative impact measurement.

To make things even more difficult, the academic fields that resist quantitative methods of measuring impact are also typically those that put the least emphasis on the production of journal articles as a standard of researcher productivity. Instead, these areas emphasize outputs like monographs, performances, edited works, and digital research projects. And while this emphasis is entirely valid from a general scholarly standpoint, it nonetheless results in a "weak citation culture" for the fields in question, which frustrates scholars in those fields who are in search of meaningful citation-based metrics. By contrast, researchers in fields with "strong" citation cultures, like engineering and the biomedical sciences, find themselves not only with greater availability of citation-based metrics like Impact Factor, but also with higher numbers of citations for their articles on average. Thus, the difference between a "good" and a "bad" Impact Factor for a researcher in genetics may be up to 20 points, while for a scholar in history, the difference may be as little as 1 or 0.5 (figure 3.10).

The opportunity here for altmetrics, of course, is that altmetrics is not exclusively concerned with definitions of impact that can only be measured through the analysis of article citations. By operating on a level that transcends the idea of citation culture, altmetrics opens up a path to quantitative impact for any scholar whose work can be represented in some capacity on the web. For qualitative researchers, this can mean anything from views, downloads, and saves of textual scholarship (e.g., articles, book chapters, essays, slide decks) to external Tweets, comments, and ratings of scholarly events (e.g., performances, presentations, exhibitions). What's more, as we saw in chapter 2, altmetrics can also cover works of special relevance to researchers who are already part of strong citation cultures, for example, by collecting information about the use of datasets, code, and pre-publication article drafts.

The chance for altmetrics to corner the market on metrics for researchers in the arts, humanities, and interdisciplinary areas while at the same time serving unmet needs for researchers in the sciences and social sciences is one of its greatest opportunities. Still, in practice, the field of altmetrics has found itself seriously struggling with some of the same problems as bibliometrics in getting qualitative scholars to participate sufficiently in the movement's culture and practices. For instance, in a 2014 study of "humanities-oriented articles and books published by Swedish universities during 2012," Swedish researcher Bjorn Hammarfelt found that coverage remained substantially lacking for humanities publications in key altmetrics-endorsed peer networks, with only 61 percent of the outputs represented via Mendeley readership and 20 percent via Twitter mentions. (16) Another study conducted the same year by Mohammadi and Thelwall, which looked specifically at Mendeley coverage of social sciences and humanities publications from 2008 (as pulled from Web of Science), was even less optimistic. It found that 44 percent of social science articles published in 2008 were represented via Mendeley readership, versus only 13 percent of humanities articles from the same period. (17)

While some of these gaps in humanities coverage might be explained by the dates of the articles examined--from 2008, in the second study--or by the country of publication--Sweden, in the first--both studies nevertheless point to a problem in the adoption of seemingly discipline-agnostic academic peer networks like Mendeley by scholars outside of the sciences and social sciences. Additionally, for all the touting of altmetrics as a means of getting beyond the journal article format, instances of altmetrics actually being used productively for impact measurement and evaluation still tend to focus heavily on articles. As Hammarfelt concludes in his 2014 article, "The possibilities that altmetric methods offer to the humanities cannot be denied but, as shown in this paper, there are several issues that have to be addressed in order to realize their potential." (18) Among these issues is the need for more liaisons and advocates to bring awareness of altmetrics to researchers across the full disciplinary spectrum, as we will discuss in chapter 4.

Public Funding Agencies and Altmetrics

Funding is a third major area in which altmetrics have had an opportunity to shine, in that their short-term, web-based measures of impact have the potential to be highly attractive to agencies connected to the interests of the general public. Evidence of funding agencies' growing interest in the power of altmetrics can be seen in several areas of the field, starting with the receipt of major grants by multiple altmetrics organizations, including the founders of Impactstory (National Science Foundation and Alfred P. Sloan Foundation); the partnership of the University of California Curation Center, PLOS, and DataOne (National Science Foundation); and the researchers behind NISO's Altmetrics Initiative (Alfred P. Sloan Foundation), to which we will return later. (19)

More attractive to most researchers, however, is the suggestion that altmetrics can be useful to the process of applying for major grants and as part of the justification for winning new grants. In January 2013, for instance, the NSF changed the biographical sketch portion of its new grants application (see figure 3.11) to allow principal investigators to list their research "products"--a term that would seem to open the door to outputs beyond the standard scholarly article. In a short editorial written for Nature in the same month, well-known altmetrics advocate Heather Piwowar pointed out the potential relationship between this decision and the use of altmetrics in funding requests. "Even when applicants are allowed to include alternative products in grant applications, how will reviewers know if they should be impressed? ...Many altmetrics have already been gathered for a range of research products." (20)

Since tracking the use of altmetrics on grant applications is inherently difficult, it is hard to say how many researchers have taken Piwowar's advice to heart and incorporated altmetrics into their applications for new or renewed funding. That said, a small number of researchers have recently begun to speak up about the practice, like Spanish ecologist Fernando Maestre, in a November 2014 post on his blog Maestre Lab titled "How I Use Altmetrics Data in My Proposals." (21) Maestre describes using evidence of his research impact from Altmetric, Faculty of 1000, and academic blogs alongside citation-based data from ISI Web of Science and Google Scholar. His example mirrors advice offered to researchers in an essay published almost simultaneously in the online journal PLOS Biology by three members of the funding organization Wellcome Trust. "ALMs and altmetrics offer research funders greater intelligence regarding the use and reuse of research, both among traditional academic audiences and stakeholders outside of academia," explain the authors, two of whom are members of Wellcome Trust's evaluation team. "While conventional citation data will continue to play a major role in research evaluation, the new metrics have the potential to provide a valuable complement to the insights revealed by traditional bibliometric indicators." (22)

The potential for altmetrics to show connections between academic research and nonacademic populations therefore holds strong appeal for funders whose own evaluations often stress the bigger-picture impact of their awarded grants. Nevertheless, with the exact nature of the connection between altmetrics and wider audiences imprecise at best, it's important to stress that funders will almost certainly continue to need significant additional evidence of a strong public connection before altmetrics can become more than an ancillary bonus in the competition for research funding. What altmetrics can do--as we saw in chapter 2--is aid in the discovery of this evidence, for example by surfacing specific comments or blog posts in the course of providing a quantitative perspective on online engagement. This again is a point of clarification that can and should be passed along to researchers, both to encourage the greater use of altmetrics in funding applications and to temper expectations of what altmetrics information can accomplish by itself. Funders, too, will need to become a more vocal part of the conversation for this opportunity to be fully realized--something that may become more likely as altmetrics appear on more applications or as pressure increases from influential leaders at the junction of the research and altmetrics communities.

The Future of Altmetrics: Standards and Institutions

Having now reviewed some of the major controversies and opportunities at play in the current landscape of altmetrics, two questions inevitably arise: First, what's next for the future of this developing field, and second, what is being done to shift the balance of issues away from the risks of altmetrics and toward their proposed rewards?

Speculating about the future of altmetrics is itself a bit risky--but based on the facts at hand, it seems probable that altmetrics will continue to fight hard on certain issues for the next several years, such as onboarding more researchers outside of the sciences and social sciences and confronting the demographic problems inherent in technologies that favor users with particular tools and privileges, akin to the digital divide. In addition, despite the gigantic leap that the field of altmetrics has made in developing new products and garnering interest from key groups like funders and institutions, it remains strangely unclear whether altmetrics is still operating somewhere within the Peak of Inflated Expectations, the second phase of the famous Gartner Hype Cycle. (23) Is the Trough of Disillusionment still to come? Or are we through the worst and already working up the slow Slope of Enlightenment? The answer is hard to guess.

Yet for all these predictions of continued uncertainty, the future does seem to be quite bright for altmetrics with regard to many of its other gaps and weaknesses. Indeed, one particular movement within the field is already helping to address what are almost certainly the problems at the heart of most criticisms of altmetrics: the lack of consistency across the field and the absence of authoritative recommendations for its practical academic use. As mentioned earlier, the National Information Standards Organization (NISO; see figure 3.12) was awarded a two-year Sloan Foundation grant in 2013 to study and develop "Community-Based Standards or Recommended Practices in the Field of Alternative Metrics." (24) As standardization is arguably the biggest roadblock to the widespread acceptance of altmetrics by administrators and university evaluators, the existence of the NISO Altmetrics Initiative is on its own excellent news for the future of altmetrics, regardless of the fact that it's still in progress.

Luckily, the progress made to date on the initiative has been extremely positive, as evidenced by the white paper released upon the completion of the project's first phase in June 2014. In it, NISO explains how it was able to hold three in-person meetings and conduct thirty in-person interviews with key stakeholders in the future of altmetrics, which the authors identify as researchers, institutional administrators, librarians, funders, publishers, and members of the general public. (25) Using the information gleaned from these meetings, in addition to a separate online altmetrics survey open to the general public, the project's leaders were able to identify a number of specific objectives for the initiative's second phase, to be completed by November 2015. These objectives include not only the development of a specific definition for what constitutes an alternative assessment metric, but also "definitions for appropriate metrics and calculation methodologies for specific output types," "development of strategies to improve data quality through source data providers," "promotion and facilitation of use of persistent identifiers in scholarly communications," and "descriptions of how the main use cases apply to and are valuable to the different stakeholder groups." (26) Taken together, these projects constitute the Holy Grail of altmetrics development, substantially increasing the clout of the movement and making new strides possible in the use of altmetrics by government agencies, research groups, and educational institutions. The bringing together of altmetrics and the quest for better use of identifiers like DOI and ORCID would also improve the accountability of online scholarship in general, a win that would help address important areas of confusion like multiple versions of online publications and other cases of unnecessary duplication.
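As a small illustration of why persistent identifiers matter for this kind of accountability, consider the following toy sketch (entirely our own example, with invented counts), which merges mention totals recorded under different surface forms of the same DOI:

    # Toy sketch: normalizing DOI strings so that records for the same
    # article, saved under different surface forms, can be merged.
    from collections import defaultdict

    def normalize_doi(doi):
        """Lowercase a DOI and strip common resolver prefixes."""
        doi = doi.strip().lower()
        for prefix in ("https://doi.org/", "http://dx.doi.org/", "doi:"):
            if doi.startswith(prefix):
                doi = doi[len(prefix):]
        return doi

    records = [
        ("doi:10.1371/journal.pone.0064841", 12),
        ("https://doi.org/10.1371/journal.pone.0064841", 7),
        ("10.1371/JOURNAL.PONE.0064841", 4),
    ]

    totals = defaultdict(int)
    for doi, mentions in records:
        totals[normalize_doi(doi)] += mentions

    print(dict(totals))  # {'10.1371/journal.pone.0064841': 23}

Without such normalization (or, better, consistent identifiers at the point of publication), the same article's online attention is silently split across multiple records.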

Finally, in imagining the future of altmetrics, it's important to acknowledge that when all is said and done, the altmetrics of tomorrow may look very different from the altmetrics we are discussing and debating today. Between the development of new types of networks and harvesters and the proposal of new methodologies for understanding the impact of different types of scholarly outputs, the altmetrics of the future may indeed be something much less "alternative" and instead appear closer to the formal approach to analysis seen in the world of bibliometrics. Even now, two scholars at the United Kingdom's Open University are proposing a new movement of "Semantometrics," which would use full-text semantic analysis of publications to determine their level of contribution across a network of citations. (27)
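As a loose sketch of the semantometrics intuition--not Knoth and Herrmannova's actual formula--one might estimate a publication's contribution from the semantic distance between the texts it draws on and the texts that later cite it, on the theory that a paper bridging distant literatures contributes more than one circulating within a single cluster:

    # Loose sketch of the semantometrics intuition using toy snippets;
    # this is an illustration of the general idea, not the authors'
    # published measure.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    cited_by_paper = ["social networks and information diffusion",
                      "graph theory for citation analysis"]
    citing_papers  = ["bibliometric evaluation of research impact",
                      "altmetrics for scholarly communication"]

    vec = TfidfVectorizer().fit(cited_by_paper + citing_papers)
    A = vec.transform(cited_by_paper)
    B = vec.transform(citing_papers)

    # Average semantic distance between the two sets of full texts
    contribution = 1 - cosine_similarity(A, B).mean()
    print(f"estimated contribution: {contribution:.2f}")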

Thus, what the next phase of altmetrics will be is largely up to the actions, endeavors, and practices of today's advocates and innovators. In the next chapter, we consider what it means for librarians to be one of these catalysts and how some libraries are already making investments in the future of altmetrics, locally and on the grander stage.

Further Reading

"NISO Alternative Assessment Metrics (Altmetrics) Initiative." National Information Standards Organization. www.niso.org/topics/tl/altmetrics_initiative.

The home portal for NISO's high-profile Altmetrics Initiative, a two-year project set to complete in late 2015 that seeks to produce a set of standards and best practices around the use of altmetrics by academics.

Adie, Euan. "Gaming Altmetrics." Altmetric blog, September 19, 2013. www.altmetric.com/blog/gaming-altmetrics.

An insightful blog post written by Altmetric founder Euan Adie in response to discussions about gaming across altmetrics.

Woolston, Chris. "Funders Drawn to Alternative Metrics." Nature 516 (December 10, 2014): 147. www.nature.com/news/funders-drawn-to-alternative-metrics-1.16524.

A brief but useful discussion of the early use of altmetrics by researchers submitting grant applications and the positive potential that some funders see in altmetrics-gleaned information.

Notes

(1.) Hannah Jane Parkinson, "Instagram Purge Costs Celebrities Millions of Followers," Guardian, December 19, 2014, www.theguardian.com/technology/2014/dec/19/instagram-purge-costs-celebrities-millions-of-followers.

(2.) Julie Keck, "Buying Fake Twitter Followers Will Leave You Tweeting to Mannequins," MediaShift, PBS, September 16, 2014, www.pbs.org/mediashift/2014/09/buying-fake-twitter-followers-will-leave-you-tweeting-to-mannequins.

(3.) Jennifer Lin, "A Case Study in Anti-gaming Mechanisms for Altmetrics: PLoS ALMs and DataTrust" (paper, altmetrics12 ACM Web Science Conference, Evanston, IL, June 21, 2012), http://altmetrics.org/altmetrics12/lin.

(4.) Euan Adie, "Gaming Altmetrics," Altmetric blog, September 18, 2013, www.altmetric.com/blog/gaming-altmetrics.

(5.) Ibid.

(6.) Stacy Konkiel and Jason Priem, "What Jeffrey Beall Gets Wrong about Altmetrics," Impactstory Blog, September 9, 2014, http://blog.impactstory.org/beall-altmetrics.

(7.) For a slightly longer rumination on this rich topic, we recommend David Crotty's 2013 post and its resulting comments on The Scholarly Kitchen blog: David Crotty, "Driving Altmetrics Performance through Marketing--A New Differentiator for Scholarly Journals?" The Scholarly Kitchen (blog), October 7, 2013, http://scholarlykitchen.sspnet.org/2013/10/07/altmetrics-and-the-value-of-publicity-efforts-for-journal-publishers.

(8.) Mike Thelwall, Stefanie Haustein, Vincent Lariviere, and Cassidy R. Sugimoto, "Do Altmetrics Work? Twitter and Ten Other Social Web Services," PLOS ONE, May 28, 2013, doi:10.1371/journal.pone.0064841.

(9.) Zohreh Zahedi, Rodrigo Costas, and Paul Wouters, "How Well Developed Are Altmetrics? A Cross-Disciplinary Analysis of the Presence of 'Alternative Metrics' in Scientific Publications," Scientometrics 101, no. 2 (2014): 1491-1513.

(10.) Recent examples of these studies include the previously mentioned study by Thelwall, Haustein, Lariviere, and Sugimoto (see note 8) and Rodrigo Costas, Zohreh Zahedi, and Paul Wouters, "Do 'Altmetrics' Correlate with Citations? Extensive Comparison of Altmetrics Indicators with Citations from a Multidisciplinary Perspective," Journal of the Association for Information Science and Technology, first published online July 28, 2014, doi:10.1002/asi.23309.

(11.) Jason Priem, Heather Piwowar, and Bradley Hemminger, "Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact," arXiv:1203.4745, March 20, 2012, http://arxiv.org/abs/1203.4745.

(12.) Caroline S. Fox, Marc A. Bonaca, John J. Ryan, Joseph M. Massaro, Karen Barry, and Joseph Loscalzo, "A Randomized Trial of Social Media from Circulation," Circulation 131 (2015): 28, doi:10.1161/CIRCULATIONAHA.114.013509.

(13.) Phil Davis, "Social Media and Its Impact on Medical Research," The Scholarly Kitchen (blog), January 14, 2015, http://scholarlykitchen.sspnet.org/2015/01/14/social-media-and-its-impact-on-medical-research.

(14.) Ibid.

(15.) "Article-Level Metrics," SPARC website, accessed January 16, 2015, www.sparc.arl.org/initiatives/article-level-metrics.

(16.) Bjorn Hammarfelt, "Using Altmetrics for Assessing Research Impact in the Humanities," Scientometrics 101, no. 2 (2014): 1419-30, www.diva-portal.org/smash/get/diva2:703046/FULLTEXT01.pdf.

(17.) Ehsan Mohammadi and Mike Thelwall, "Mendeley Readership Altmetrics for the Social Sciences and Humanities: Research Evaluation and Knowledge Flows," Journal of the Association for Information Science and Technology 65, no. 8 (August 2014): 1631, doi:10.1002/asi.23071.

(18.) Hammarfelt, "Using Altmetrics," 1429.

(19.) Heather Piwowar and Jason Priem were originally awarded $125,000 by the Sloan Foundation to develop the product that became Impactstory in 2012. In 2013, the Sloan Foundation awarded them an additional $500,000 to further develop the scalability of Impactstory. That same year, Impactstory received a separate $300,000 Early Concept Grants for Exploratory Research (EAGER) grant from NSF. See Heather Piwowar, "ImpactStory Awarded $500k Grant from the Sloan Foundation," Impactstory Blog, June 17, 2013, http://blog.impactstory.org/sloan, and Heather Piwowar, "ImpactStory Awarded $300k NSF Grant!" Impactstory Blog, September 27, 2013, http://blog.impactstory.org/impactstory-awarded-300k-nsf-grant. The partnership of the University of California Curation Center, PLOS, and DataOne was awarded an approximately $300,000 EAGER grant by NSF in September 2014 for a project that would help develop data-level metrics (DLMs). See National Science Foundation, "Making Data Count: Developing a Data Metrics Pilot," award abstract 1448821, December 19, 2014, www.nsf.gov/awardsearch/showAward?AWD_ID=1448821&HistoricalAwards=false. The Sloan Foundation awarded NISO $207,500 in 2013 to help develop standards and recommended practices for altmetrics. See "NISO Awarded Sloan Foundation Grant to Develop Standards and Recommended Practices for Altmetrics," NR [NISO Reports], Information Standards Quarterly 25, no. 2 (Summer 2013): 40, www.niso.org/apps/group_public/download.php/11276/NR_Altmetrics_Sloan_isqv25no2.pdf.

(20.) Heather Piwowar, "Altmetrics: Value All Research," Nature 493 (January 9, 2013): 159, www.nature.com/nature/journal/v493/n7431/full/493159a.html.

(21.) Fernando Maestre, "How I Use Altmetrics Data in My Proposals," Maestre Lab blog, November 26, 2014, http://maestrelab.blogspot.com/2014/11/how-i-use-altmetrics-data-in-my.html.

(22.) Adam Dinsmore, Liz Allen, and Kevin Dolby, "Alternative Perspectives on Impact: The Potential of ALMs and Altmetrics to Inform Funders about Research Impact," PLOS Biology, November 25, 2014, doi:10.1371/journal.pbio.1002003.

(23.) The five phases of the Gartner Hype Cycle are Technology Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment, and Plateau of Productivity. To learn more about the Gartner Hype Cycle, see www.gartner.com/technology/research/methodologies/hype-cycle.jsp.

(24.) From the title of the 2013 grant proposal authored by Todd Carpenter and Nettie Lagace of NISO. See Todd Carpenter and Nettie Lagace, "Proposal to Study, Propose, and Develop Community-Based Standards or Recommended Practices in the Field of Alternative Metrics," March 19, 2013, www.niso.org/apps/group_public/download.php/11012/niso-altmetrics-proposal_public_version.pdf.

(25.) "Alternative Metrics Initiative Phase 1 White Paper," NISO, June 6, 2014, www.niso.org/apps/group_public/download.php/13809/Altmetrics_project_phase1_white_paper.pdf.

(26.) "Phase 2 Projects," in "NISO Alternative Assessment Metrics (Altmetrics) Initiative," NISO website, accessed January 16, 2015, www.niso.org/topics/tl/altmetrics_initiative/#phase2.

(27.) Petr Knoth and Drahomira Herrmannova, "Towards Semantometrics: A New Semantic Similarity Based Measure for Assessing a Research Publication's Contribution," D-Lib Magazine 20, no. 11/12 (November/December 2014), www.dlib.org/dlib/november14/knoth/11knoth.html.


Figure 3.4

Mendeley readership metrics, such as those in this screenshot, are often credited with having the highest correlation of any altmetrics indicator with the bibliometrics standard Times Cited.

Readership Statistics: 67 Readers on Mendeley

By Discipline: 45% Computer and Information Science; 28% Social Sciences; 7% Education
By Academic Status: 48% Librarian; 12% Other Professional; 9% Ph.D. Student
By Country: 13% United States; 10% Spain; 9% United Kingdom

Note: Table made from bar graph.

Figure 3.6

Social media demographics have become extremely important when trying to understand the value of altmetrics for particular academic audiences. For instance, according to a survey conducted by the Pew Research Internet Project, 74 percent of all online adults used social networking sites as of January 2014. However, for older respondents this percentage was much lower: 65 percent for those aged 50 to 64 and less than 50 percent for those 65 and older. These statistics, and related statistics based specifically on the use of social networking sites by researchers, can be useful when considering the inclusion of nonacademic social media metrics in academic contexts. Pew Research Center, "Social Networking Fact Sheet," accessed January 16, 2015, www.pewinternet.org/fact-sheets/social-networking-fact-sheet.

Who uses social networking sites
% of internet users within each group who use social networking sites

All internet users             74%

a  Men                         72
b  Women                       76

a  18-29                       89 (cd)
b  30-49                       82 (cd)
c  50-64                       65 (d)
d  65+                         49

a  High school grad or less    72
b  Some college                73
c  College+                    73

a  Less than $30,000/yr        79
b  $30,000-$49,999             73
c  $50,000-$74,999             70
d  $75,000+                    73

Source: Pew Research Center's Internet Project January Omnibus Survey, January 23-26, 2014.
Note: Percentages marked with a superscript letter (e.g., a) indicate a statistically significant difference between that row and the row designated by that letter, among categories of each demographic characteristic (e.g., age).

PEW RESEARCH CENTER

Chapter 4

Altmetrics and the Role of Librarians

Library Involvement

As altmetrics have emerged and continue to grow and evolve, so too has the role that academic librarians play in supporting altmetrics, metrics, and impact, from user support to professional use to advocacy. Librarians in this field will need to continue not only filling these roles, but also making sure they remain part of the conversation as it moves forward and staying up-to-date with developments in the area.

The concept of library involvement pertaining to metrics did not originate with altmetrics. As explained in chapter 1, Impact Factor was originally created primarily for use by librarians in making collection development and retention decisions. Libraries continue to bear primary responsibility for the acquisition of bibliometrics tools, most notably Web of Science, Journal Citation Reports, and Scopus, as well as the training of people in their use. As a result, librarians are already familiar with providing support for these tools, so it makes sense that librarians have expanded to support the variety of altmetrics sources and tools discussed in chapter 2.

Additionally, librarians are natural leaders when it comes to altmetrics, not only because of their familiarity with the resources, but also because of the relationships they maintain with several disparate groups. These relationships allow librarians to serve as a neutral voice and advocate on behalf of the needs of their community, while also providing insight about the tools and metrics they help support through their own experience and expertise.

The following sections detail some specific areas in which libraries and librarians are supporting and interacting with altmetrics.

Acquisition/Evaluation/Access

Despite the fact that some altmetrics tools are primarily marketed toward individual scholars, librarians remain the primary gatekeepers when it comes to acquiring and providing access to resources, as well as deciding which resources best fit the needs of their research community. We have already seen altmetrics begin to shift from free resources to paid models, and tools like Altmetric Institutional and PlumX require both a subscription and a level of back-end support to be successfully implemented at an institution, roles that libraries are already familiar with providing. Thus, it's likely that libraries will continue to serve as gatekeepers for most altmetrics products. However, for products with less recognition, this may also mean taking more assertive action to ensure that funding is available and to increase awareness and use once the tools have been purchased, which brings us to the next role libraries play in altmetrics.

Outreach/Training/Marketing

Librarians are uniquely situated to deliver altmetrics information to researchers and to tailor this information based on the varying needs of their user population. Educating users on library tools is hardly a new role, and as with many outreach efforts, librarians must take care to present the information in a way that best resonates with their users. For many, a term like altmetrics may carry no meaning, conjure up an overly narrow meaning (equating altmetrics with Twitter counts, for example), or carry unpleasant associations (such as the possibility of gaming). For this reason, librarians must be careful to present information regarding altmetrics in an informative and accessible way, while also taking care to differentiate it from bibliometrics and other similar concepts.

Research guides are a common way to introduce altmetrics, while also providing links to tools and other helpful sources of information. One such guide was developed by the University of Pennsylvania as part of a larger guide, "Research Impact and Citation Analysis" (see figure 4.1). This helps place altmetrics in the appropriate context, while also giving researchers links for additional information (including a separate tab for further reading).

Other common forms of outreach include workshops, one-on-one appointments with researchers, and online tutorials. Handouts can also be an effective advertising tool. A double-sided handout at Curtin University succinctly explains areas of expertise, summarizes services provided, and provides contact information and links to additional information (see figure 4.2).

Communication/Advocacy

While educating users is a vital function of librarians, they are also one of the strongest voices in the altmetrics movement, partly due to their knowledge, but also due to their unique positioning as a neutral voice and central academic hub within their institutions. Additionally, librarians often enjoy open communication lines with many different stakeholders on campus, which places them in a perfect position to facilitate communication when it comes to issues like altmetrics. This means not only communicating with some of those groups individually, but also setting the stage for multiple groups to communicate with each other, directly or indirectly. The following are some specific groups with whom it may be particularly important to communicate and messages that it may be important to impart.

FACULTY AND RESEARCHERS

As mentioned above, awareness and understanding of altmetrics, scholarly impact, and related topics are of primary importance, but encouraging faculty and researchers to take a proactive stance among their colleagues and within their departments or research centers can also be an effective means of indirect communication. For example, prompting faculty to examine internal procedures for measuring scholarly impact in decisions like promotion, merit, tenure, or awards can help these groups consider the role altmetrics can or should play in those procedures.

GRADUATE AND UNDERGRADUATE STUDENTS

Students are an important demographic: today's students become tomorrow's researchers, and because they are still developing their research skills, they are often open to incorporating new ideas into their research practice. Since they are often in close contact with other researchers, students can also be effective advocates for altmetrics tools and principles.

ADMINISTRATORS

Research metrics are often used in evaluation decisions, and administrators are often the ones serving as decision makers. An understanding of altmetrics can help secure the funds necessary to purchase and implement an institutional altmetrics tool and can ensure that altmetrics are used appropriately in the decision-making process. After all, encouraging faculty to incorporate altmetrics into their procedures and files (such as files for tenure) is ineffective if the evaluators reading the file cannot correctly interpret these metrics or if they misunderstand the context in which the metrics are delivered.

PUBLISHERS AND TOOLMAKERS

Librarians don't often think of publishers and toolmakers as a group in need of communication--after all, the companies that provide library tools are often in steady contact with libraries. However, a good relationship means developing two-way communication, so that we not only stay aware of developments from the publishers and toolmakers, but also provide feedback on their tools based on our own observations and the needs of the users we serve. As with many aspects of modern scholarship, publishers are unsure about the future of altmetrics or what it means for them. Encouraging practices like the creation of freely available article-level metrics or the incorporation of tools like the Altmetric donut can influence the altmetrics landscape while also helping researchers measure their scholarship in different ways.

SAGE is one publisher that has implemented article-level metrics and the Altmetric donut for several of its journals (see figure 4.3), providing researchers with valuable metrics that would otherwise be difficult to collect.

Collection Development

The idea of using research metrics for their original purpose seems almost foreign in today's scholarly landscape, but metrics still serve as a powerful indicator of journal usage and impact. While librarians may be more familiar with using locally collected usage data, such as COUNTER data, careful application of altmetrics can give librarians additional perspective that can be used to make collection decisions such as journal cancellations. One tool that's particularly effective in this regard is Altmetric Explorer. As detailed in chapter 2, one of the primary purposes of this tool is to help librarians evaluate journals, with the ability to explore altmetrics-based data and create custom reports. Some researchers even advocate for incorporation of altmetrics like CiteULike's social bookmarks or Mendeley's readership metrics into journal evaluation decisions. (1)
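As a hedged illustration of how these signals might be weighed together, the sketch below (in which the journal names, numbers, columns, and weighting scheme are all invented) merges COUNTER-style download counts with altmetrics mention counts to surface the weakest candidate in a cancellation review:

    # Hypothetical sketch: combining COUNTER download counts with
    # altmetrics mention counts to rank journals for cancellation review.
    # File contents, column names, and the weighting are all invented.
    import pandas as pd

    counter = pd.DataFrame({
        "journal": ["Journal A", "Journal B", "Journal C"],
        "downloads": [5200, 480, 1900],
    })
    altmetrics = pd.DataFrame({
        "journal": ["Journal A", "Journal B", "Journal C"],
        "mentions": [340, 15, 260],
    })

    merged = counter.merge(altmetrics, on="journal")
    # Normalize each signal to 0-1 so neither dominates, then average
    for col in ("downloads", "mentions"):
        merged[col + "_norm"] = merged[col] / merged[col].max()
    merged["engagement"] = merged[["downloads_norm", "mentions_norm"]].mean(axis=1)

    print(merged.sort_values("engagement").head(1))  # weakest candidate first

The point is not this particular weighting, which any library would tune to local priorities, but that altmetrics can sit alongside usage data as one more column in an evidence-based collection review.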

Integration with Library Tools

One aspect of altmetrics that has excited many librarians is the ability to incorporate them into existing library tools, most notably institutional repositories. This provides an opportunity to not only bring renewed interest and attention to existing institutional repositories, but also provide an incentive for researchers to deposit their scholarship into a repository, as it can give them access to metrics that may be otherwise unavailable. These same metrics can, in turn, also give librarians additional tools for the evaluation of their own institutional repository. Two notable tools that offer some level of integration with institutional repositories are Altmetric and PlumX. Both make altmetrics data available for individual pieces of scholarship with DOIs or other digital identifiers--Altmetric through its distinctive donut, which can be displayed when viewing an item record within the institutional repository, and PlumX through the PlumX interface, which reports institutional repository metrics for all items, as well as for individual items. For more information, Konkiel and Scherer's 2013 article "New Opportunities for Repositories in the Age of Altmetrics" provides an excellent overview on this subject. (2)
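For instance, Altmetric offers a free public REST API from which data for a single DOI can be retrieved (endpoint current as of this writing). The minimal sketch below reads the response defensively with .get(), since field names may change over time, and uses the PLOS ONE article from note 8 purely as an example DOI:

    # Minimal sketch of fetching altmetrics for a single DOI from
    # Altmetric's free public API (https://api.altmetric.com).
    import requests

    doi = "10.1371/journal.pone.0064841"  # example DOI from note 8
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")

    if resp.status_code == 200:
        data = resp.json()
        print(data.get("title"))
        print("Altmetric score:", data.get("score"))
        print("Posts mentioning it:", data.get("cited_by_posts_count"))
    elif resp.status_code == 404:
        print("No altmetrics recorded for this DOI.")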

We are also starting to see incorporation of altmetrics into discovery system tools, with Ex Libris's addition of a Metrics tab featuring the Altmetric donut to Primo. With the help of librarian advocacy, the integration of altmetrics into existing tools is a trend that is likely to continue in the future.

Scholarly Research

Librarians who have gained familiarity with altmetrics are well positioned to actively contribute to the scholarly landscape. Indeed, some of the most prominent names in altmetrics have ties to library and information science, including Jason Priem, a doctoral student in information science; Stacy Konkiel, who served as a scholarly communications librarian before joining Impactstory and, later, Altmetric; and Mike Buschman, a librarian and cofounder of Plum Analytics. The idea of focusing scholarly research on metrics isn't new, as a number of librarians have published articles related to bibliometrics within the past twenty years, so altmetrics research is a natural extension of the same research area. A number of librarians have already given presentations and written articles, white papers, and books on the subject of altmetrics, some of which are detailed below. As we've explored, a number of areas related to altmetrics are in need of more scholarly research and communication, including the integration of altmetrics into libraries and institutions and both research-focused and practical applications of altmetrics.

Self-Evaluation

Finally, librarians are not only educating others about altmetrics and integrating altmetrics into library tools, but are acting as consumers of altmetrics by using these metrics for their own purposes. Librarians are particularly well situated to benefit here, since altmetrics provide a mechanism for quantitatively measuring scholarly activities that previously had few evaluative measures, such as the Twitter comments and conversations surrounding a professional presentation or a scholarly blog post.

Ways to Stay Current

Given the quickly changing nature of altmetrics, librarians and library administrators wishing to stay on top of recent developments must be proactive in seeking out information, since different venues bring different perspectives on the latest activities and trends within altmetrics. The following venues represent many of these differing viewpoints, but the list is far from exhaustive.

Customized Google News

The popular treatment of altmetrics gives important insight into the translation of this topic beyond the academic realm, often hinting at the more general public perspective along with practical applications of theoretical topics (e.g., an article detailing the use of altmetrics within a CV). Google News can be customized for a variety of topics and sources, so developing a News alert with an altmetrics focus, along with other topics of interest, is relatively easy, though saving it does require a Google login. As shown in figure 4.4, Google News has been customized: using the Personalize feature in the right-hand column, topics of interest like "altmetrics" and "research impact" have been added, along with specific sources like "Guardian Weekly" and "Nature." The topic "research impact" has been selected along the left-hand column, displaying recent related news articles. An RSS feed can also be created based on the personalized Google News selections.

Google News

http://news.google.com
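Once a feed URL has been generated from your personalized selections, checking it programmatically is also an option. Below is a small sketch using the feedparser library; the feed URL is a placeholder standing in for whatever feed your own Google News customization produces:

    # Sketch: polling an RSS feed of altmetrics news with feedparser.
    # Replace FEED_URL with the feed generated from your own
    # personalized Google News selections.
    import feedparser

    FEED_URL = "https://news.example.com/rss/altmetrics"  # placeholder
    feed = feedparser.parse(FEED_URL)

    for entry in feed.entries[:5]:
        print(entry.title)
        print("  ", entry.link)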

Google Scholar Publication Alerts

While Google Scholar searches can be a valuable way to discover many altmetrics publications that fall outside the realm of traditional databases, including white papers, pre-publication manuscripts in arXiv, and nonindexed journals, Google Scholar can also be set up to periodically send e-mails on topics of interest. These options, shown in figure 4.5, are available in the Alerts option when logged in to Google Scholar. More sophisticated search terms such as "scientometrics," "semantometrics," "altmetrics and libraries," or "altmetrics and institutional repositories" are appropriate for these alerts, based on your specific interests.

Google Scholar

http://scholar.google.com

ARL Publication Alerts

While less customizable than the previous two examples, ARL publications are an excellent way to keep up with the latest news, reports, research, and events related to the top research libraries in the United States, and they cover many areas of interest, including altmetrics and research impact. There are multiple ways to receive alerts, including signing up for the ARL e-newsletter and receiving updates through Google+, Facebook, or Twitter or directly from the ARL home page.

ARL Publications

www.arl.org/publications-resources

ARL home page

www.arl.org

Toolmaker and Scholarly Blogs

Practically all of the altmetrics toolmakers have developed blogs, which are regularly updated with news items, feedback, questions, and special looks at specific aspects of altmetrics and related issues. These blogs often feature posts directly targeted toward librarians and can also serve as sources of opportunity, such as Impactstory's call in May 2014 seeking applications from librarians and researchers to serve as Impactstory advisors. (3) A sampling of these blogs is listed in the gray box.

Impactstory

http://blog.impactstory.org

Plum Analytics

http://blog.plumanalytics.com

Altmetric

www.altmetric.com/blog

Kudos

http://blog.growkudos.com

Mendeley

http://blog.mendeley.com

In addition to toolmaker blogs, a number of scholarly blogs serve a much-needed role within academia by providing news, opinions, and thoughtful analysis on a number of issues. These blogs serve different purposes and are written from different perspectives: some have been created by individuals, others have a number of contributors, and still others represent a company or organization. Similarly, they cover different aspects of academia, with some specializing in particular niches while others take a broader view. Collectively, these blogs have all covered altmetrics information, trends, and applications to academic research in a meaningful way, but from different viewpoints and with different intended audiences. See several examples in the gray box.

The Citation Culture

http://citationculture.wordpress.com

The Scholarly Kitchen

http://scholarlykitchen.sspnet.org

Research Trends

www.researchtrends.com

Presentations, Workshops, and Invited Speakers

Often, the most useful and up-to-date information tailored to an academic library audience can be obtained through in-person presentations. Within the past two years, the topic of altmetrics has been covered at many prominent library conferences. ACRL, ALA, Charleston, and Internet Librarian International have all featured presentations from librarians; altmetrics tool providers like Altmetric, PLOS, and PlumX; and organizational bodies like NISO. Additionally, several nonlibrary conferences have a strong altmetrics interest, including 1:AM, the 2014 London-based altmetrics conference, and the ACM Web Science Conference series, most recently altmetrics14. These conferences focus more heavily on the information science and technical perspective, but they represent some of the most cutting-edge altmetrics developments. Finally, consider hosting your own event to increase knowledge locally. One example of this approach is the University of Washington, where the UW libraries partnered with the College of the Environment to bring Heather Piwowar and Jason Priem, cofounders of Impactstory, to campus to speak about altmetrics.

1:AM London 2014 altmetrics conference

www.altmetricsconference.com

altmetrics14 ACM Web Science Conference

http://altmetrics.org/altmetrics14

University of Washington altmetrics event, held April 24, 2013

www.lib.washington.edu/about/news/exhibits/archive/tracking-scholarly-impact-online

Books

Altmetrics has reached a point in its development where there is enough information and (relative) stability to support monographic publications. As of January 2015, one book had been published, with two more due in the months that followed. (Full disclosure: one of these books is written by the authors of this report.)

The published book, Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact, is written from the information science perspective and focuses on the technical aspects of scholarly research metrics. As Nature's review of the book points out, it is far from an "accessible working narrative to guide us in our day jobs" and is aimed at other information science researchers developing and analyzing research metrics. (4)

Another book, Altmetrics: A Practical Guide for Librarians, Researchers and Academics, is scheduled for publication in December 2015 by the United Kingdom's Facet Publishing. Its editor, Andy Tattersall, is an information specialist at the University of Sheffield, and the book features chapters from UK-based researchers and altmetrics toolmakers, including Euan Adie of Altmetric and William Gunn of Mendeley. It promises to be a practical guide to altmetrics with a strong European focus.

Facet Publishing page for Altmetrics: A Practical Guide for Librarians, Researchers and Academics

www.facetpublishing.co.uk/title.php?id=0105

Finally, Meaningful Metrics: A 21st Century Librarian's Guide to Bibliometrics, Altmetrics and Research Impact was published by ACRL in May 2015. This book takes a broader look at the field of research metrics, with chapters divided between explanations of theory and hands-on applications, including walkthroughs, tips for getting started with different library applications, and contributions from other librarians working in the field. Its authors are both US librarians (who like to write technical reports in their spare time), and the book is aimed at an academic librarian audience.

Conclusion

In this chapter, we reviewed the various ways in which librarians have engaged with altmetrics, from providing access to tools that encourage deeper analysis of scholarly impact to using altmetrics in their own core activities, such as research and collection development. With greater knowledge of the field now both possible and practical, however, librarians are increasingly well positioned to serve an even wider variety of roles: educators and information supporters, direct consumers of altmetrics, and facilitators who create opportunities for discourse and debate.

For many, the field of altmetrics continues to be a source of uncertainty and confusion, with changes seeming to occur practically on a daily basis. Nevertheless, the potential value of altmetrics to users of all kinds makes the involvement of knowledgeable parties like librarians a worthwhile investment, not only in the future of the field, but also in the future of academic impact, scholarly communication, and intellectual diversity.

Further Reading and Resources

Chin Roemer, Robin, and Rachel Borchardt. "Institutional Altmetrics and Academic Libraries." Information Standards Quarterly 25, no. 2 (2013): 14-19.

This article, written by the report authors, speculates on the future for institutional-level altmetrics, and also details the roles librarians can play in the altmetrics landscape.

Lapinski, Scott, Heather Piwowar, and Jason Priem. "Riding the Crest of the Altmetrics Wave: How Librarians Can Help Prepare Faculty for the Next Generation of Research Impact Metrics." College and Research Libraries News 74, no. 6 (June 2013): 292-300.

"4 Things Every Librarian Should Do with Altmetrics." Impactstory Blog, June 25, 2014. http://blog .impactstory.org/4-things-librarians-altmetrics.

These two publications, an article and a follow-up blog post, present different angles on a similar theme concerning the role of librarians in altmetrics: the C&RL News article takes a more research-oriented approach, while the blog post focuses on the practical side, with more recent pictures and links.

Taylor, Michael, Jenny de la Salle, and Kristi Holmes. Librarians and Altmetrics: Tools, Tips and Use Cases. Library Connect webinar, February 20, 2014. http://libraryconnect.elsevier.com/library-connect-webinars. (Requires BrightTalk registration to view.)

This online webinar, hosted by Elsevier, brings together an Elsevier research specialist and two librarians to discuss (what else?) librarians and altmetrics.

Notes

(1.) Stefanie Haustein, Multidimensional Journal Evaluation: Analyzing Scientific Periodicals beyond the Impact Factor (Boston: De Gruyter, 2012), 198-214.

(2.) Stacy Konkiel and Dave Scherer, "New Opportunities for Repositories in the Age of Altmetrics," Bulletin of the American Society for Information Science and Technology 39, no. 4 (April/May 2013): 22-26, https://asis.org/Bulletin/Apr-13/AprMay13_Konkiel_Scherer.html.

(3.) Stacy Konkiel, "Do You Have What It Takes to Be an Impactstory Advisor?" Impactstory Blog, May 5, 2014, http://blog.impactstory.org/do-you-have-what -it-takes-to-be-an-impactstory-advisor.

(4.) Jonathan Adams, "Bibliometrics: The Citation Game," Nature 510, no. 7506 (06/26/2014): 470-71.

Robin Chin Roemer is the instructional design & outreach services librarian at the University of Washington Libraries in Seattle, Washington. Her responsibilities include support for online learning, professional and continuing education programs, and information literacy initiatives across the UW Libraries. She holds a bachelor's degree in English from UCLA, a master's degree in English from UC Santa Barbara, and an MLIS from the University of Washington. Robin previously held the role of communication librarian at American University, where she worked closely with departments including journalism, film and media arts, communication studies, and public communication. She is the coauthor, with her report coauthor Rachel Borchardt, of the book Meaningful Metrics: A 21st Century Librarian's Guide to Bibliometrics, Altmetrics, and Research Impact.

Rachel Borchardt is the science librarian at American University in Washington, DC. Her current job responsibilities include serving as a liaison to seven science departments, as well as responsibilities related to assessment, marketing, and research impact support within the library. She holds dual bachelor's degrees in neuroscience and psychology from Oberlin College, as well as an MLIS with a specialization in medical librarianship from the University of Pittsburgh. Prior to her current position, she served as the biology and neuroscience and behavioral biology librarian at Emory University, and she has recently presented on the subject of altmetrics and academic libraries at several national-level venues, including the Charleston Conference and ALA Midwinter.
