ClassDojo and educational 'accomplishment'
As kiddo’s school year has gotten into full swing and mine has gotten busier, I’ve spent less time griping about her school’s use of ClassDojo. However, I’ve also become increasingly annoyed that the weekly update email I get from the company always has the subject line “What did your child accomplish this week?” The body of the email is divided into two sections: the number of “points” my child was assigned and the number of “stories” my child appeared in. Do points and stories reflect accomplishment?
I’m reminded of what Wiggins and McTighe write about educational assessment in their book Understanding by Design, which has its limits but serves as a major foundation for how I design assessment in my courses. Early in their introduction, the authors give a fictitious example of an elementary school unit about apples: writing stories about apples, making apple art, singing songs about apples, and eventually visiting a local apple orchard. Wiggins and McTighe criticize this unit as overflowing with activities but short on assessment and evidence.
Now, this is one of the areas where I see the book’s limits. It just so happens that my elementary school-aged kid did an activity very similar to this one, and I honestly don’t see the issue with it. I believe that assessment is important, but I also believe that learning happens through activity, that this is especially true at younger ages, and that my kid is better served by underassessment than overassessment.
That caveat aside, I do think that Wiggins and McTighe’s criticism of units built on activities rather than assessment applies perfectly to these ClassDojo emails. I don’t see how any edtech company can argue with a straight face that “points” and “stories” are educational accomplishments for me to be proud of. Rather, this approach inverts Wiggins and McTighe’s “backward design”: it starts with whatever data is available and tries to assign educational value to that, rather than asking what has educational value and collecting data on that.
This observation reminded me of an upcoming webinar by Dr. Dirk Ifenthaler for AECT on the “validity of data sources in learning analytics”:
Recent developments in learning analytics, which are a socio-technical data mining and analytic practice in educational contexts, show promise in supporting learning processes and enhancing study success in higher education, through the collection and analysis of data from learners, learning processes, and learning environments to provide meaningful feedback and scaffolds when needed. However, an analysis of more than 35,000 publications shows that rigorous, large-scale evidence on the effectiveness of indicators for learning analytics in supporting learning processes and study success is still lacking. This webinar will review the promises and opportunities of learning analytics and tackle the challenges of implementing indicators into productive higher education ecosystems.
I suspect that learning analytics suffers from this problem a lot: asking what data is easy to collect rather than asking what data is really important. As education gets more and more datafied, this problem is only likely to grow, but ClassDojo really takes the cake for trying to sell the data it collects (and sells!) as something more valuable than it really is.
Similar Posts:
ClassDojo and the creation of artificial demand
being a student's parent as an edtech researcher
new report on Google Classroom and ClassDojo
putting my work where my whining is
some thoughts on platforms and 'community'