This week …
I watched a lot of cricket.
I shall resist the temptation to run a cricket analogy, despite my mind running wild with interesting connections between the New Zealand side’s marvellous victory and school (perhaps due to sleep deprivation). Suffice it to say, a few years ago New Zealand were crap and now they’re world champions; many of the personnel are the same as when they were crap; what’s changed is the culture, which now emphasises process over results, positive relationships, and trust in people, and gives people time to grow.1
They’ve surprised a lot of people.2
An interesting idea: AI amplifying human capacity
I attended an interesting panel discussion on AI in Education3 a couple of weeks ago. I was intrigued by the panellists’ view that NZ tech comes at things from a values base that puts people at the centre, which, according to them, is unique. It’s something that should be considered when thinking about the role of AI in education.
It’s an idea that’s certainly got me thinking about the education apps out there, most of which are not from NZ.
It didn’t take me long to see some general trends:
Many take a behaviourist approach to learning, incentivising it with the ‘lolly’ of a game of some sort after the completion of a task. (Be a good learner and you get to play afterwards.)
Student progress is controlled by an algorithm that determines what comes next, and this is called personalisation. (Do well and we’ll give you harder work to help you extend yourself; don’t do well and we’ll keep you here repeating stuff.)
Learners often use the app by themselves. Where there is the ability to learn with others, it is usually framed as a competition. (Success is up to YOU; you know you’ve been successful when you’re better than others.)
People often ask me where technology fits in play-based learning, and I usually quote Seymour Papert4, who made this crucial distinction:
“the computer is being used to program the child,” [or] “the child programs the computer.”5
I am uneasy when edtech falls into the first category. I have no issues when it falls into the second, but the problem is that not many apps do.
Why should we care? Another big idea explored in the panel discussion was that if we are going to develop tech that is values-based, AI should enhance our ability to make decisions that reflect our values. It shouldn’t make decisions for us. A crucial component of enabling values-based ‘people centricity’ was ensuring there is always a ‘human in the loop’, which means that one of AI’s roles is to help present a clearer view that people can interpret and act from. In other words, AI should supercharge what is noticed so we can supercharge what we value.
Our actions are driven by what we notice, and quite often what we notice and act on are tiny clues - small data. How we respond to those clues changes the context, sometimes in small ways, sometimes in big ways. Have you ever been surprised by what a kid can do once the context is changed? I have, and I’ve seen it so often that I now think that when it comes to learning and the development of ability (or, if you want, achievement) it’s not the kid but the context that determines success.

When we outsource learning to the machine, and that machine conditions kids with rewards, determines what comes next based on their response to that conditioning, and shows them that success means beating other people, everything becomes outcome focused and oriented to a desired result.
In doing this, we’ll probably get a lot of the results and outcomes we expect.6
But we reduce the opportunities kids have to surprise us.7
Does this reflect what we want? Is this really how people grow, learn, become better?
I’m starting to lose track of the number of high-achieving sports teams where this process-first culture is the modus operandi. At the very least, it makes me wonder why schools that proudly fly the academic excellence flag don’t adopt it, instead opting for a relentless focus on competition, outcomes, and results.
If you want to read more about this remarkable change, this article is excellent.
This was a fantastic event hosted by EdTechNZ. A particular highlight was the Year 13 student from Wellington High banging the right to privacy drum loud and clear and challenging some of the basic assumptions schools make about what is fine for people to know about kids.
I recommend his book Mindstorms: Children, Computers, and Powerful Ideas.
Is this the first question that should be asked when looking to adopt technology in schools: which of these two does it make more likely?
This is why behaviourism works as a concept, right? It has a high ‘strike rate’.
Are we ethically bound to keep this door open?