Heading into finals, campus sent out a message acknowledging that AI detection tools may not be trustworthy, which is great. However, those tools are bundled into the plagiarism detection software we already have access to, so the same caveat should be extended to that software, too.