Educause, Higher Education’s big technology dance, was last week.
Just about every booth, even the one with the elephants, had analytics in every phrase of copy, every piece of collateral.
It was impressive. Particularly the example (demo’d live) that showed a student wondering how their time spent in the course compared with students in the top 10% of the class.
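The mechanics of that comparison are simple enough to sketch. Below is a minimal, hypothetical version in Python: the class's weekly time-on-task figures and the student's own number are invented, and a real LMS would pull them from activity logs rather than a hard-coded list.

```python
# Hypothetical sketch of the "how do I compare with the top 10%?" demo.
# All numbers here are invented for illustration.
import statistics

def top_decile_cutoff(hours_by_student):
    """Return the 90th-percentile weekly time-on-task across the class."""
    # quantiles(n=10) yields nine cut points; the last is the top-decile boundary.
    return statistics.quantiles(hours_by_student, n=10)[-1]

class_hours = [2.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 7.0, 8.0, 9.0]
cutoff = top_decile_cutoff(class_hours)
my_hours = 6.0
print(f"Top 10% of students spend at least {cutoff:.1f} h/week; you logged {my_hours:.1f}.")
```

Trivial arithmetic, in other words. The hard part is not computing the cutoff; it is deciding what to do with it.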
Pretty cool. And the presenters were quick to acknowledge that time-on-task and similar indirect measures of learning were uncertain proxies for achievement.
The partner institution presented preliminary results of their use of analytics: 49 students across 3 courses. They were “just getting going” with data analysis, rolling it out carefully and measuring the effects of measurement.
It is rare in higher education to attend a session where an institution has made concrete changes based on large volumes of data. More on this in a moment.
I liked the student example because it speaks to the coming opportunities that learners will enjoy…should they take advantage of them. We have the power to let students know that their behavior closely resembles that of students who achieved or failed.
But do we have the interventions that work?
Will faculty engage with all of this data to work with students?
Will the data, collected by the terabyte in our LMS platforms, tell us which professors are effective? That teaching does matter? Which strategies are effective? There is a collective assumption that these massive investments will take our retention up 8 points and create a wildly better-educated, more successful student body.
If students aren’t logging in, aren’t participating, we’ll soon have the data at our fingertips.
When they showed up late (or not at all) for an f2f class, then sat in the back half asleep, there was no way for the institution to roll up the data and intervene. But the faculty member teaching the course knew. In manufacturing they call it “visual management,” and its lessons were hard won: simple and transparent wins over complicated any day. Did the average faculty member engage with the student who came late, slumped in the back, or refused to engage in conversation? Probably not.
Few educators are publicly engaging with the truths we can already tease from vast, existing datasets. If someone really wanted to know, they could. We don’t need another enterprise-level data-BI-warehouse-analytics package to do basic research. But still we have conference presentations featuring 49 students, or data that shows 90%+ of students are “at or above expectations.”
Like parents realizing they have little effect on pre-teen and teenage children, that the die has been cast, universities will come to understand that personal responsibility and motivation are a huge differentiator among traditional 18-24 year-olds.
So who will be motivated to act, given the immense dataset already available?
If universities were going to act, they could have already. The data to complete sophisticated one-off analyses exists. Arguably, wringing wisdom from the data and making it visible to stakeholders, as in the “what do the top 10% of students do?” example, could be revealing, even revolutionary.
What would make students act? What would motivate students?
A colleague and I hit upon a thought experiment:
What if students were notified that their behavior (logins, assessment results, reading activity in ebooks, discussion activity) was sub-par, and that if they didn’t show a marked improvement they would be released from the course and their pro-rated tuition rebated?
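To make the thought experiment concrete, here is a hypothetical flagging rule in Python. Every field name and threshold below is invented for illustration; any real policy would be negotiated by the institution, not hard-coded by a vendor.

```python
# Hypothetical "sub-par behavior" flag for the thought experiment.
# Metrics and thresholds are invented; a real system would draw them
# from LMS activity data and institutional policy.
from dataclasses import dataclass

@dataclass
class Engagement:
    student_id: str
    logins: int        # logins this term
    avg_score: float   # mean assessment score, 0-100
    posts: int         # discussion posts this term

def flag_subpar(records, min_logins=10, min_score=60.0, min_posts=3):
    """Flag students who fall short on two or more engagement metrics."""
    flagged = []
    for r in records:
        shortfalls = sum([
            r.logins < min_logins,
            r.avg_score < min_score,
            r.posts < min_posts,
        ])
        if shortfalls >= 2:
            flagged.append(r.student_id)
    return flagged

records = [
    Engagement("a", logins=25, avg_score=82.0, posts=7),
    Engagement("b", logins=4, avg_score=55.0, posts=1),
    Engagement("c", logins=12, avg_score=48.0, posts=2),
]
print(flag_subpar(records))  # → ['b', 'c']
```

The rule itself is a dozen lines. The institutional consequences of acting on it are the entire argument that follows.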
The institution would be in danger of losing tuition – dramatically focusing faculty interest on engaging students through fear of revenue declines. And students would be forced to consider what is required of them, in the grand bargain that is higher education. Rather than take their money, knowing that large numbers of students are heading for failure and debt (my engineering program started in an auditorium of 600+ students listening to the proud mantra of “look to your left, look to your right – they won’t be here when you graduate”), students would be ejected from the school.
Would schools simply increase grade inflation, passing everyone? Or would interests align: more and better work, increased clarity of purpose, performance data calculated in good faith and trusted as correct? How would “faculty engagement with assessment” change if 80% of students were clearly, visibly striving to achieve the learning outcomes set out in a given course?