Altmetrics and societal impact measurements: Match or mismatch? A literature review
DOI: https://doi.org/10.3145/epi.2020.ene.02
Keywords: Bibliometrics, Societal impact, Altmetrics, Citations, Research evaluation, Scholarly communication, Social media, Twitter, Review article.
Abstract
Can alternative metrics (altmetrics) data be used to measure societal impact? We wrote this overview of the empirical literature in order to answer this question. The overview has two parts. The first part, "societal impact measurements", covers possible methods of and problems in measuring the societal impact of research, case studies of societal impact measurement, societal impact considerations at funding organizations, and the societal problems that science is expected to solve. The second part, "altmetrics", addresses a major question in research evaluation: whether altmetrics are proper indicators for measuring the societal impact of research. In this part we describe the data sources used in altmetrics studies and the importance of field-normalized indicators for impact measurements. The review indicates that impact measurements should be oriented towards pressing societal problems. Case studies in which the societal impact of specific pieces of research is explained appear to be a legitimate method for measuring societal impact. When altmetrics are used, field-specific differences should be taken into account by applying field normalization (in cross-field comparisons). Altmetrics data such as social media counts may mainly reflect public interest in and discussion of scholarly works rather than their societal impact. Altmetrics (Twitter data) might be employed especially fruitfully for research evaluation purposes if they are used in the context of network approaches. Conclusions based on altmetrics data in research evaluation should be drawn with caution.
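Since the abstract highlights field normalization of altmetrics counts for cross-field comparisons, the minimal Python sketch below illustrates the generic idea: a paper's raw count is divided by the mean count of papers in the same field and publication year, so that scores become comparable across fields. The field labels, counts, and grouping used here are hypothetical examples, not data from the review nor the exact indicator families it discusses.

```python
# Minimal sketch of field normalization for altmetrics counts (e.g., tweets or
# Mendeley readers). A paper's raw count is divided by the average count of all
# papers in the same field and publication year. Fields, years, and counts are
# illustrative assumptions only.
from collections import defaultdict
from statistics import mean

papers = [
    {"id": "p1", "field": "medicine",    "year": 2018, "tweets": 40},
    {"id": "p2", "field": "medicine",    "year": 2018, "tweets": 10},
    {"id": "p3", "field": "mathematics", "year": 2018, "tweets": 2},
    {"id": "p4", "field": "mathematics", "year": 2018, "tweets": 0},
]

# Expected (mean) count per field-year reference set.
groups = defaultdict(list)
for p in papers:
    groups[(p["field"], p["year"])].append(p["tweets"])
expected = {key: mean(counts) for key, counts in groups.items()}

# Normalized score: raw count divided by the field-year mean.
# A value of 1.0 means "average attention for this field and year".
for p in papers:
    e = expected[(p["field"], p["year"])]
    p["normalized"] = p["tweets"] / e if e > 0 else 0.0
    print(p["id"], p["field"], p["tweets"], round(p["normalized"], 2))
```

A score above 1.0 indicates above-average attention within the paper's own field and year, which is what makes comparisons across fields with very different counting cultures meaningful.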
License
Conditions for disseminating articles once they are published
Authors may publicize their articles under the following terms:
Two weeks after publication (the time needed for Google to index the version on the journal's website), authors may make a copy of the work published by EPI available on their own (personal or institutional) websites or in any open access (OA) repository, subject to the following conditions:
- Only the publisher's version should be made public. Please do not post preprints, postprints, or page proofs.
- The copy must be accompanied by an explicit mention of the publication in which the text appeared, together with a clickable link to the URL: http://www.profesionaldelainformacion.com
The journal Profesional de la información publishes its articles in open access under a Creative Commons BY license.