Ed tech insiders are probably used to articles rife with warnings, concerns and other worries, especially around the issue of commercialization. But the piece on learning analytics from this week’s Chronicle of Higher Education was different, and if those ed tech insiders haven’t read it carefully by now, they should.
Part of what was different, of course, had to do with the person who was the focus of the piece: Candace Thille, architect of the Open Learning Initiative and evangelist for adaptive learning systems and the learning analytics that power them. It does make one take notice, and rightfully so, when someone of this stature and deep expertise raises serious questions about the inner workings of these systems that are being developed for sale to our institutions.
This wasn’t a case of academic pearl-clutching over for-profits’ being involved in the noble enterprise of higher education. It was a criticism of the quality of the products themselves, and the inability to have a real discussion about that quality when it’s encased in a proprietary black box.
My problem with this piece? Only that it doesn’t go far enough. Quality control issues, half-baked designs and missed opportunities to apply learning science are all problems that go beyond just learning analytics systems. Characterized by companies’ caginess about what the systems actually do, who authored them, and what specific aspects of learning science they are built on, this syndrome keeps a number of promising products in the zone of not-ready-for-prime-time.
Questions about the quality and transparency of commercially developed systems will continue to be hashed out as ed tech matures. In the meantime, though, we can’t afford to put analytics on hold. We faculty need to take matters into our own hands, and to see what that looks like in practice, you need look no further than what I think was the most important detail of the Thille story: how she uses data to inform each day of her OLI-enhanced course.
In doing so, she’s applying a strategy I think of as small analytics, in a nod to Jim Lang’s latest (and quite wonderful) book Small Teaching. Small analytics constitute those data that we teachers can access and interpret easily, using the expertise we already have and tools already in place. It’s doing what we can, with what we have, to identify and respond to information telling us what students need next to keep their learning moving forward.
Small analytics is me checking to see how many times a student attempted the infinitely repeatable quizzes in my Introduction to Psychology course before they come to see me about their grades. No special tech needed here, just the data readily available in BbLearn and the knowledge that a student who tries a quiz just one time 5 minutes before it is due probably has different issues than the student who tries them all multiple times well in advance but whose scores never improve. It’s the language professor who taught her TAs to look at how quickly an assignment was completed as well as the overall grade, before holding tutoring sessions with students. It’s the engineering professor who weighed speed as well as accuracy in grading student homework on basic principles of circuit analysis. It’s any instructor exploiting data that build up in online learning systems to diagnose what’s going on with a student and respond in the right way.
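To make this concrete, here is a minimal sketch of what a "small analytics" check might look like in code. The record format, field names, and thresholds are all hypothetical assumptions for illustration; a real gradebook export (from BbLearn or any other LMS) will look different, but the diagnostic logic — distinguishing the last-minute single attempt from the many-attempts-no-improvement pattern — is the same one described above.

```python
# A hypothetical sketch of "small analytics" on quiz-attempt data.
# Record shape and cutoffs are illustrative assumptions, not a real LMS API.

from dataclasses import dataclass

@dataclass
class Attempt:
    hours_before_due: float  # how long before the deadline the attempt was made
    score: float             # percentage score, 0-100

def diagnose(attempts: list[Attempt]) -> str:
    """Suggest a likely issue from one student's attempt history on a quiz."""
    if not attempts:
        return "no attempts"
    if len(attempts) == 1 and attempts[0].hours_before_due < 1:
        # One try, minutes before the deadline: likely an engagement issue.
        return "last-minute single attempt"
    scores = [a.score for a in attempts]
    if len(attempts) >= 3 and max(scores) - scores[0] < 5:
        # Several early tries but flat scores: likely a comprehension issue.
        return "repeated attempts, no improvement"
    return "typical pattern"

# Two students showing the contrasting patterns described above.
crammer = [Attempt(hours_before_due=0.08, score=55.0)]
struggler = [Attempt(48, 52.0), Attempt(36, 54.0), Attempt(24, 53.0)]
print(diagnose(crammer))    # last-minute single attempt
print(diagnose(struggler))  # repeated attempts, no improvement
```

The point isn't the code itself — a spreadsheet sort does the same job — but that the rule an instructor applies by eye is simple enough to write down and apply every day.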
What’s key to this approach – and the biggest hurdle to getting it going – isn’t the technology, it’s the routines. Looking back at what Thille said about her own teaching, the most important thing wasn’t the technology she used, but how she used it: consistently, daily, as a natural and integral part of her teaching practice.
So how do we get to this place of routine with small analytics? Think of it like email. There was some point in your life when you hadn’t yet started doing email as part of daily life, either because you weren’t advanced enough in your education or career, or because (like me) you are old enough to remember when email was not yet a thing. Maybe you had to constantly remind yourself to make use of this new-to-you technology. It was a chore on your to-do list, something you forgot, tried again and forgot again.
Then, something happened: Checking email became a natural part of your daily life, something that, if anything, you end up doing too much. Maybe you can’t imagine starting out your work day without it, because you wouldn’t know the latest issues you need to tend to or important information that informs the work you’re about to do.
Now, we all know that email should not substitute for the big goals you wanted to accomplish in your day. And just as constantly checking email isn’t really the main aim of our careers, neither is constantly checking analytics the main aim of teaching. But having it be a natural part of our workflow as teachers, and a critical support for doing teaching well, would advance us toward realizing the promise of what ed tech can do.
Of course we need great technology. But technology that doesn’t get used, and used skillfully, may as well not exist. Teaching practices that skillfully use the technology can be cultivated, but not by hectoring people about how they are not being student-centered enough, or by throwing quick fixes at them. They certainly won’t be cultivated just by spending money on a product. Routines are deeply embedded in our lives, and their impact can be profound. They belong in the conversation about analytics in teaching.