Teaching Data Analysis in the Social Sciences: A case study with article level metrics
Conditional Remix & Share Permitted
CC BY-NC-SA

This case study is taken from the open book Open Data as Open Educational Resources: Case studies of emerging practice.

Course description:

Metrics and measurement are important strategic tools for understanding the world around us. To take advantage of the possibilities they offer, however, one needs the ability to gather, work with, and analyse datasets, both big and small. This is why metrics and measurement feature in the seminar course Technology and Evolving Forms of Publishing, and why data analysis was a project option for the Technology Project course in Simon Fraser University’s Master of Publishing Program.

The assignment:

“Data Analysis with Google Refine and APIs”: Pick a dataset and an API of your choice (Twitter, VPL, Biblioshare, CrossRef, etc.) and combine them using Google Refine. Clean and manipulate your data for analysis. The complexity/messiness of your data will be taken into account.
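
The case study itself works through this exercise in Google Refine (now OpenRefine). Purely as an illustration of the API half of the assignment, and not part of the original materials, the Python sketch below pulls article-level metadata for a list of DOIs from the CrossRef REST API (one of the suggested sources) and writes a CSV that could then be imported into Google Refine for cleaning. The DOI, output filename, and contact address are placeholders.

# Illustrative sketch: fetch article-level metadata from the CrossRef REST API
# and export it as a CSV for cleaning in Google Refine / OpenRefine.
import csv
import requests

# Placeholder DOIs; replace with the articles in your chosen dataset.
dois = ["10.1234/example.doi"]

# CrossRef asks polite clients to identify themselves; the address is a placeholder.
headers = {"User-Agent": "mpub-data-exercise (mailto:student@example.org)"}

rows = []
for doi in dois:
    resp = requests.get(f"https://api.crossref.org/works/{doi}", headers=headers, timeout=30)
    resp.raise_for_status()
    work = resp.json()["message"]
    rows.append({
        "doi": doi,
        "title": (work.get("title") or [""])[0],
        "journal": (work.get("container-title") or [""])[0],
        "citations": work.get("is-referenced-by-count", 0),
    })

# Write a CSV that can be imported into Google Refine for further cleaning and analysis.
with open("crossref_metrics.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["doi", "title", "journal", "citations"])
    writer.writeheader()
    writer.writerows(rows)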

Subject:
Applied Science
Information Science
Social Science
Sociology
Material Type:
Case Study
Author:
Alessandra Bordini
Juan Pablo Alperin
Katie Shamash
Date Added:
03/27/2019
Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations
Unrestricted Use
CC BY

The Journal Impact Factor (JIF) was originally designed to aid libraries in deciding which journals to index and purchase for their collections. Over the past few decades, however, it has become a metric relied upon to evaluate research articles based on journal rank. Surveyed faculty often report feeling pressure to publish in journals with high JIFs and mention reliance on the JIF as one problem with current academic evaluation systems. While faculty reports are useful, information is lacking on how often and in what ways the JIF is currently used for review, promotion, and tenure (RPT). We therefore collected and analyzed RPT documents from a representative sample of 129 universities from the United States and Canada and 381 of their academic units. We found that 40% of doctoral, research-intensive (R-type) institutions and 18% of master’s, or comprehensive (M-type), institutions explicitly mentioned the JIF, or closely related terms, in their RPT documents. Undergraduate, or baccalaureate (B-type), institutions did not mention it at all. A detailed reading of these documents suggests that institutions may also be using a variety of terms to indirectly refer to the JIF. Our qualitative analysis shows that 87% of the institutions that mentioned the JIF supported the metric’s use in at least one of their RPT documents, while 13% of institutions expressed caution about the JIF’s use in evaluations. None of the RPT documents we analyzed heavily criticized the JIF or prohibited its use in evaluations. Of the institutions that mentioned the JIF, 63% associated it with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. In sum, our results show that the use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and indicate that there is work to be done to improve evaluation processes and avoid the potential misuse of metrics like the JIF.

Subject:
Applied Science
Health, Medicine and Nursing
Information Science
Life Science
Social Science
Material Type:
Reading
Author:
Carol Muñoz Nieves
Erin C. McKiernan
Juan Pablo Alperin
Lesley A. Schimanski
Lisa Matthias
Meredith T. Niles
Date Added:
08/07/2020