REM and Impact Factors

What do REM and impact factors have in common?! To a few of you, the answer may be something to do with sleep. But to my mind, the movement to reconsider the place of traditional measures of scholarly publishing (aka alt-metrics) is fast becoming the new alternative rock: a challenge to the scholarly status quo. Yep, that's a pretty tortuous metaphor (not being an alternative rock fan, my knowledge is based on the trusty Wikipedia article), but either way, alt-metrics is becoming more visible and more widespread. And just as REM went on to fame and fortune, librarians too need to be aware of where this may lead.
Alt-metrics (or alternative metrics) aims to track scholarly impact on the social web. It is an approach that attempts to supplement traditional citation measures of quality by taking into account how researchers work on the open web in the 21st century. The idea of quality has always been important in academia, not least for the promotion system, for funding agencies and for the development of personal reputation. Within the traditional, limited print publication system, peer review, citation counting and journal impact factors have formed the backbone of these measures of academic excellence.

In the web 2.0 world, however, these systems of measurement have started to be seen as too limiting. Detractors point out that reviewers are not held accountable, that context and impact outside academia are ignored, and that it is relatively easy to game the system. Furthermore, information, knowledge and learning have changed. Knowledge is no longer confined to journal output, especially in the still far too closed world of academic publishing. Scholarship is becoming far more diverse, and information, data and evidence of learning can be found in social citation tools or through self-publishing such as blogs or social media. By looking at readership or re-use statistics as well as citation statistics, a richer picture of the influence of a piece of work can be formed.
Crucially, alt-metrics do not claim to provide a complete new system to measure impact. They are designed to be used in conjunction with more traditional tools. And with the glacial rate of change in the academy, it is clear that new measures of impact may take a while to develop. However, as the success of the Open Access movement in Latin America shows, we as subject specialists need to be even more aware of the potential for change in our area of expertise.
Tools to measure impact:
Total Impact: Measures readership and re-use across several sites such as Mendeley, Slideshare, Delicious, Wikipedia and Twitter, among others. The ability to search by DOI, URL or Mendeley library makes it one of the most complete tools around. It reports the number of mentions per tool.
ReaderMeter: Designed to provide impact measures closer to real time, ReaderMeter has adapted the H and G indices to measure readership (bookmarks) instead of citations. It relies quite heavily on Mendeley data.
For other tools (particularly for the sciences) see the AltMetrics tools.
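To make the H and G indices mentioned above concrete: the H index is the largest h such that h items each have at least h citations (or, in ReaderMeter's adaptation, bookmarks), and the G index is the largest g such that the top g items together have at least g² of them. The sketch below is purely illustrative (the function names and the per-item counts are my own examples, not ReaderMeter's actual code):

```python
def h_index(counts):
    """Largest h such that h items each have at least h citations/bookmarks."""
    counts = sorted(counts, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def g_index(counts):
    """Largest g such that the top g items have at least g**2 citations/bookmarks in total."""
    counts = sorted(counts, reverse=True)
    total, g = 0, 0
    for rank, count in enumerate(counts, start=1):
        total += count
        if total >= rank * rank:
            g = rank
    return g

# Hypothetical bookmark counts for five papers:
bookmarks = [10, 8, 5, 4, 3]
print(h_index(bookmarks))  # 4 (four papers with at least 4 bookmarks each)
print(g_index(bookmarks))  # 5 (top 5 papers have 30 >= 25 bookmarks in total)
```

The G index rewards a few highly read items more than the H index does, which is why ReaderMeter reports both.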
Tools to establish authority:
As digital scholarship practices become more established, scholars should build an online presence to establish authority and cement a digital reputation. The following tools can help:
Google Scholar Citations: When people search for an author on Google Scholar, profiles showing personal details and citation information are displayed. These profiles also provide basic, more traditional citation metrics such as the H index and i10 index.
Mendeley: Public profiles, which are indexed by Google, provide personal details as well as relevant article statistics. Mendeley provides readership statistics as a cumulative total, as well as readership statistics per article (including readership by discipline, academic status, and country).
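The i10 index shown on Google Scholar profiles is simpler than the H index: it is just the number of publications with at least 10 citations each. A minimal sketch, with made-up citation counts:

```python
def i10_index(citation_counts):
    """Number of publications with at least 10 citations each (Google Scholar's i10 definition)."""
    return sum(1 for count in citation_counts if count >= 10)

# Hypothetical citation counts for five papers:
print(i10_index([120, 45, 12, 9, 3]))  # 3
```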