frustration with institutional research analytics
Over the summer, I blogged about some concern that I had about a new research portal that my employer had just rolled out. Based on the gentle nudges to update our profiles we’ve been receiving since the platform’s launch, I’m guessing that faculty have not been as keen on the platform as the university is. One of those nudges came this week, and in the spirit of good faith cooperation, I spent some time going through the platform and updating my profile.
Yet, I’m still kind of bugged by the framing that accompanied this most recent reminder to update our profile. In particular, we were told that:
- it’s our responsibility to keep our profile “up to date and correct,” and
- it is in our best interest as researchers to keep that profile up to date, since stakeholders inside and outside the university use the platform to identify experts and collaborators.
With regard to the first issue, I already put a lot of effort into creating a public-facing online profile of my research (of which this very website is the core), and I’m already hard-pressed to find all the time for research that I would like to have. Given both of these considerations, it frankly seems like a waste of time to add this platform to my to-do list.
Now, the second point would be an excellent counter to my concerns about the first point. If there were individual benefits to my keeping up a profile on this research platform, I’d be more than happy to put in the extra work; in fact, when the platform was first announced, I was pretty eager to check it out in case it could make a solid addition to my web presence. However, the more time I’ve spent with the platform, the less convinced I am that stakeholders inside and outside the university could use it to match my expertise with their needs.
For example, here’s a current look at my “research fingerprint” as determined by the platform:
Now, I’ll admit that the platform doesn’t get things entirely wrong. A plurality of my published studies are contributions to the Educational Technology literature, focused on Teachers’ use of Social Media. I can see where that comes from. I am, though, more than a little confused by how high French ranks under Social Sciences. It’s somewhat flattering, given my longstanding Francophilia, but I only have three publications that explicitly touch on the French language. How, then, does that part of my research profile rank almost as highly as Educational Technology and higher than Social Media, which is relevant to nearly all of my publications?
The bigger question for me is what Computer Science is doing in the second half of my fingerprint. I am not a computer scientist, don’t think of myself that way, and wouldn’t be recognized as one, so that categorization is puzzling in itself. Beyond that, though, I’m amused (if frustrated) by the presence of tags like Cultural Learning, Critical Discourse Analysis, and Hegemonic Masculinity under this header. I’m already resentful that Educational Technology is getting lumped under Computer Science, but there’s some (tenuous) logic to that, and at least it also shows up under Social Sciences. Why does this second category get to lay claim to tags related to culture, CDA, and gender? What is going on here?
With all of this considered together, I’m not convinced that this research fingerprint represents the work that I actually do. In fact, the one time that this platform has even been marginally useful to me was in allowing someone to contact me with a request to present some recent research to an interested party. However, that research came from my dabbling in Mormon Studies, which is not at all represented in this research fingerprint. Now, I will acknowledge that I have only recently added my Mormon Studies work to the platform—those journals aren’t picked up automatically by the platform, so it takes manual addition; since then, some of those publications (but not all!) have been tagged with “Mormons” or “Latter-day Saints”, and those tags may yet appear on my overall fingerprint. Nonetheless, that certainly wasn’t the case when I was approached about that research presentation. It’s pretty obvious to me that the only reason the person reached out through the research platform is that it was the top hit when he searched for my name and affiliation.
That’s frustrating because it knocks my own website, with a more accurate and holistic view of my research, out of the way. Furthermore, because of institutional emphasis on this platform, we are no longer allowed to list our own websites on our faculty profiles, instead trusting that this platform will do that work for us.
I don’t trust this platform to accurately represent my research, I chafe at the idea that it’s my responsibility to keep it up to date, and I flatly reject the argument that it’s in my best interest to do so. That doesn’t mean I won’t continue to make good faith efforts to keep this profile up to date, but I will do so with the understanding that this somehow benefits my unit, my college, and my institution. Yet, even though I’m committed to helping out at those levels, I worry about low-quality automated tagging, aggregate quantification, and our growing (and not always critical) adoption of analytics. Quantified analytics may make it more convenient for large institutions to get a picture of things, but as my case shows, they also distort the details of what we’re doing. Simply put, this isn’t how the quality or contribution of research should be measured, and I’m concerned about moving things in this direction.
- macro
- Work
- Scholars@UK
- analytics
- research
- Elsevier
- University of Kentucky
- French
- Mormon Studies
- quantification
Similar Posts:
research analytics for... industry collaboration?
high school class rankings and the value-laden non-objectivity of quantitative measures
new publication: an autoethnography on French, data science, and paradigm change
Two-Face, DezNat, and Lavina Fielding Anderson—mission compatriots
appearance on Dialogue Out Loud podcast