
Better known as the Telmarine who revolted against his own people to side with the Narnians.
Continuing the series about technology; look here for the first introductory post, The Lion, The Witch, and the Wardrobe.
The drive for “personalized learning,” also known as competency-based education, is the hot new reform pushed by those who believe that technology will save the day.
The dream is warehouse-style buildings that house students for eight hours a day, sitting in front of computer screens and learning from the programs presented to them.
Reformers tout the benefits:
- continuous testing;
- data generation;
- cost-effectiveness (no need to pay professional teachers; the computer does the work, and all that’s needed is a minimum-wage paraprofessional to monitor the behavior of 50 to 100 students);
- students learning at their own pace;
- profitability, because businesses can open and run these schools.
No need for messy democracy, local control via an elected school board, or any other obstacle that stops them from raiding state treasuries for tax dollars.

This is a scenario worthy of Isaac Asimov: which is better, the silicon-based teacher or the carbon-based teacher? Reformers want a silicon-based teacher for your children. Their yearning for a teacher-proof classroom is a wish to remove the carbon-based professional from the room. But the carbon-based teacher has an advantage the silicon-based teacher will never obtain.
You see, the silicon-based teacher, that personalized learning a/k/a competency-based education, is a machine powered by artificial intelligence (AI) algorithms that were written by human beings.
That fact offers no comfort. AI algorithms, more commonly called bots, run off on their own; soon their human creators can no longer control their data collection or their reactions to the data they analyze.
Isaac Asimov tapped into humanity’s deepest fear about technology and human-like machines, a/k/a androids, and wrote his Three Laws of Robotics:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
The vision of the reformers violates the first two laws.
Technology must not harm a human being, either through action or inaction. Yet technology concerns abound:
- Too much screen time harms young children. It isolates them at a time when their primary means of learning should be play. Indeed, it can delay the development of the early reading and math skills it is supposed to promote. It delays the development of social skills, which can lead to later symptoms of anxiety and depression in school-age children.
- Brain changes take place in children who spend hours in front of screens; in particular, a premature thinning of the cortex.
- Screen time can disrupt sleep and exercise, causing issues with obesity and other disorders.
The idea that human teachers can be replaced by technology and children can learn from spending hours online violates the first law.
Technology is a tool to be used by the teacher, not one that controls the teacher’s instructional decisions. Even iReady, a much-panned educational program from Curriculum Associates, has its software written to turn off a learning module if a child fails it twice and alert the human teacher that intervention is needed. It is the teacher’s job to determine the next steps and provide appropriate instruction.
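The fail-twice gate described above is simple enough to sketch. This is a hypothetical illustration of that design, not iReady’s actual code; the function name and the threshold of two failures are assumptions for the example:

```python
# Hypothetical sketch of a "fail twice, then stop" gate like the one
# described above. Names and threshold are illustrative only.

def next_action(failures: int, max_failures: int = 2) -> str:
    """Decide what the software does after a failed module attempt."""
    if failures >= max_failures:
        # Turn off the module and hand the decision back to a human teacher.
        return "alert_teacher"
    # Otherwise, let the child keep working in the module.
    return "continue_module"
```

The crucial design choice is the last step: when the software runs out of answers, it stops and defers to a human rather than routing the child down yet another automated path.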
Yet, as an iReady associate once admitted to GOT, district officials cut off her presentation just as she was about to tell teachers how to modify iReady’s instructional pathway when a teacher decides a student should work in a different sequence or on a different topic.
Without human teachers making those decisions, children are at the mercy of the technology; since technology feels no emotions, that mercy is not likely to exist. The vision of dozens of children in a large room, supervised by a paraprofessional whose job is to control misbehavior and keep them on task, their learning controlled by the software’s AI bots, violates the second law.
Rather than having robots obey humans, children must obey the software bots.
Bill Gates writes software for computers, Laurene Powell Jobs is the widow of the creator of Apple computers, Mark Zuckerberg runs the world’s largest online social media platform, Reed Hastings provides entertainment via internet streaming … the list goes on. Every one of these self-designated education experts believes in technology because technology is their life, and they have made billions of dollars in personal wealth from it.
It’s only human that they would be swayed into believing that technology is the best method for educating children.
They are wrong. The reformers’ tech-based vision violates the laws of robotics, which even AI bots should obey.