With OpenAI and other LLM companies demonstrating the amazing potential of an AI future, it's become an almost unavoidable question: what should we teach students?
I hear all the time about bringing more STEM (Science, Technology, Engineering and Mathematics) education to our schools. I watch (with horror) as some politicians and others try to shift the focus of universities away from teaching “liberal arts” and toward “job training” or “meeting companies' needs”. In an AI future, getting this question right matters more than ever.
Education that is reduced to job-prep skill training will fail us. Any “job” that can be reduced to a series of logical, sequential decisions will be replaced by technology, and very soon. The AI future won't need humans for these tasks.
Humans make bad computers.
We’re not great at repetitive tasks and deductive thinking. We require lots of maintenance and training (and retraining). We want to get paid more as we get more experience. We talk back. We make mistakes. We take breaks and vacations.
So, instead of training the next generation of workers for jobs that likely won't exist, perhaps we should focus on developing capabilities that are uniquely human and will probably always create value. We can teach students to thrive in the AI future, not merely struggle to survive it.
1. Critical Thinking
How to ask better questions, how to dig deeper, how to connect dots, how to find patterns, how to deal with ambiguity, how to structure exploration.
2. Creativity
How to create, explore, communicate and implement ideas
3. Interpersonal skills
How to relate to other people, how to build and maintain relationships, how to empathize, how to persuade, how to deal with conflict
I’m not arguing that we stop teaching math or science. Not at all. Instead, I’m suggesting that we also teach students how to think better, how to create and how to work with other people. Those skills are timeless and always in demand.
Teach students to solve problems, not to answer test questions, in an AI future where AI can ace every test
I’m in no way suggesting that we stop teaching STEM, writing, history or any other subjects. I’m suggesting that “teaching to the test,” turning liberal arts education into job training programs (which some people are actually trying to do) and similar “train people like they’re robots or computers being programmed” approaches are going to fail us in an AI future. Science education is a means toward creating, improving, solving real-world problems and living a more fulfilling life, not strictly a mechanism to train a person to perform a specific job.
Schools should absolutely teach students science, math and the rest. But those subjects are means to an end: thinking critically, creating, relating to other people. They should be taught for more than passing standardized tests or performing a job as it's defined today.
Some believe that younger generations are less able to think critically, but I don’t buy it at all. I might even argue the opposite. Creativity and innovation are happening today at a pace never before experienced. Today’s 20-somethings are more accepting of others, more creative and more technically savvy than 40-, 50- or 60-year-olds (are now or were when they were 20). It’s true that fewer of them could “show their work” on a math problem or create paper engineering diagrams. But their capacity for using technology far exceeds that of previous generations.
There are always anecdotes and exceptions, but I don’t buy the broader “this generation of kids isn’t as xyz as the last one”. However, there are systemic trends pushing future generations toward not being taught civics, not developing critical thinking skills, being taught to revere authority above all else and lacking the real-world skills to adapt to a changing world. Look at what Scott Walker has been trying to do with the University of Wisconsin, or at the No Child Left Behind and Common Core programs. I worry a lot about this. Higher test scores can correlate with better real-world skills (great education can produce both results), but higher scores definitely don’t cause better real-world outcomes. I once hired an intern who had perfect SATs and grades.
If you told him what to do, he excelled. If you gave him a problem to solve, he’d ask what to do and wait until you told him. He simply didn’t know how to structure his own solution.