000048218 000__ 03426cam\a22004335i\4500
000048218 001__ 48218
000048218 003__ SzGeWIPO
000048218 005__ 20230817181942.0
000048218 008__ 160922s2016 maua|||| b||||001|0 eng|d
000048218 020__ $$a9780262337656$$qeBook
000048218 020__ $$a9780262035125$$qPrint
000048218 035__ $$a(OCoLC)1393694016
000048218 040__ $$aSzGeWIPO$$beng$$erda$$cSzGeWIPO
000048218 041__ $$aeng
000048218 050_4 $$aQ180.55.E9
000048218 08204 $$a020.727$$223
000048218 1001_ $$aGingras, Yves,$$d1954-,$$eauthor.
000048218 24510 $$aBibliometrics and research evaluation :$$buses and abuses /$$cYves Gingras.
000048218 264_1 $$aCambridge, Massachusetts :$$bThe MIT Press,$$c2016.
000048218 300__ $$axii, 119 pages :$$billustrations (black and white) ;$$c24 cm
000048218 336__ $$atext$$btxt$$2rdacontent
000048218 337__ $$aunmediated$$bn$$2rdamedia
000048218 338__ $$avolume$$bnc$$2rdacarrier
000048218 4901_ $$aHistory and foundations of information science
000048218 504__ $$aIncludes bibliographical references and index.
000048218 5050_ $$aIntroduction -- 1. The Origins of Bibliometrics -- 2. What Bibliometrics Teaches Us about the Dynamics of Science -- 3. The Proliferation of Research Evaluation -- 4. The Evaluation of Research Evaluation -- Conclusion: The Universities’ New Clothes? -- Index
000048218 520__ $$aThe research evaluation market is booming. "Ranking," "metrics," "h-index," and "impact factors" are reigning buzzwords. Government and research administrators want to evaluate everything -- teachers, professors, training programs, universities -- using quantitative indicators. Among the tools used to measure "research excellence," bibliometrics -- aggregate data on publications and citations -- has become dominant. Bibliometrics is hailed as an "objective" measure of research quality, a quantitative measure more useful than "subjective" and intuitive evaluation methods such as peer review that have been used since scientific papers were first published in the seventeenth century. In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they pretend to. Although the study of publication and citation patterns, at the proper scales, can yield insights on the global dynamics of science over time, ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research. Moreover, abuse of bibliometrics occurs when data is manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy.
000048218 650_0 $$aResearch$$xEvaluation.
000048218 650_0 $$aBibliometrics.
000048218 650_0 $$aEducation, Higher$$xResearch$$xEvaluation.
000048218 650_0 $$aUniversities and colleges$$xResearch$$xEvaluation.
000048218 650_0 $$aIndustrial property.
000048218 7001_ $$aBuckland, Michael,$$eeditor.
000048218 830_0 $$aHistory and foundations of information science.
000048218 85641 $$uhttps://ebookcentral.proquest.com/lib/wipo/detail.action?docID=5966394$$yView eBook
000048218 903__ $$aHistory and foundations of information science.
000048218 904__ $$aBook
000048218 980__ $$aBIB