AI, Education, & Jobs. We Need To Scrap Everything.


Few times in my AI analyst career have I seen the future of AI so clearly in my mind as after studying its impact on education.

AI’s capacity to transform education has long been considered one of the supreme use cases, not only for its potential to improve us, but also because, well, you might want to teach your kids how to use the technology that allegedly changes everything; it's kind of a self-fulfilling prophecy.

Knowing this, one would expect that, by now, we should be seeing transformative results; however, the reality is quite different. But not because AIs aren’t good.

We are the problem.

Today, we look at:

  • The state of AI and its impact on education, based on studies from around the world, from Nigeria to MIT,

  • Unique experiments being done to give birth to ‘AI-native’ schools,

  • My personal guide to using AIs for learning (which is, and will always be, my primary use case for these models),

  • Finally, using our conclusions to demonstrate why not only the current education system is flawed, but so is our labor market.

Enjoy!

State of AI Education

To start, let’s cover what the research is saying about the use of AI in education.

From Nigeria to Texas

The first thing to understand is that research reveals a mixed picture regarding the impact of AI in education.

As this article explains, a test in a school in Nigeria using GPT-4 allowed kids (especially girls) to progress the equivalent of two years in just six weeks. The premise was to utilize Generative AI tools as tutors for students and assess whether their English grades improved.

But not only that, they improved in all categories. As shown below, the probability mass shifted toward higher scores, meaning that the overall grades increased.

Moving on, another piece by the World Economic Forum describes a particularly successful use case in China: at a rural primary school in Hubei province, students began taking lessons with an AI tutor from the company Squirrel AI.

After just one month, the school observed “significant improvement in students’ grades, engagement, and confidence”. Squirrel AI’s adaptive learning system diagnoses each student’s knowledge gaps and delivers tailored lessons and practice.

Such AI tutors are helping to fill teacher shortages in remote areas; Squirrel AI now operates in over 2,000 learning centers, serving 24 million students across China. Notably, the company has provided millions of free accounts to low-income families, showing how AI can expand access to quality tutoring.

But in some places, they are taking it a wild step further. In Texas, we have what may be considered the first “AI Native” School, in which kids start leveraging AI from the get-go.

The school, called ‘Alpha School’, has a revolutionary approach: it utilizes personalized artificial intelligence to teach an entire day of core academic lessons in just two hours.

And what do they do the rest of the day? Non-academic activities, such as public speaking and financial literacy.

The approach is so unique that they don’t even have what we call ‘teachers’. Instead, they have ‘guides’ whose role isn’t primarily about teaching, but rather ensuring that the environment is conducive to learning with the AI tutor.

For example, if the task requires engagement between the kids, the guide’s job is to foster that collaboration. And the results are impressive.

According to this video, despite having only two hours of class per day, kids learn twice as fast.

The main selling point is personalization. Instead of one teacher tailoring a single lesson to the whole class, faster and slower learners alike, the teaching is outsourced to a personalized AI that focuses on each particular kid, adapts to their doubts and speed of learning, and supports them all the way.

Of course, kids love it. Only two hours of “class”, with the rest of the day spent on the things each kid is excited about: swimming, music, art… all while ensuring they constantly interact with one another, fostering social skills that are profoundly absent in the generations hitting the workforce as you read these lines.

But are parents satisfied?

Some are swearing by it. As seen in this thread, one parent notes that personalized AI tutoring undermines the “teacher-in-front-of-class” model and, what’s more, they were so impressed that this school was one of the reasons they chose to live in Texas.

Their own kids are a testament to what this paradigm implies:

  1. Besides their normal AI-led curriculum, one (7) is learning survival skills and how to cook.

  2. The other (9) is starting a podcast and a dog-walking business.

The specific outcomes matter less than the point: these kids expedite learning in the crucial areas (maths, coding, history…) while gaining valuable skills for the real world. If your kid has started six businesses by the age of 15, you tell me whether that kid has better chances of success than one still memorizing things for a test they’ll forget as soon as the exam ends.

Thus, using AI not only improves overall learning performance; it also takes learning ‘out of the way’ so kids can do what kids like to do, play, but in ways that help them learn about the world beyond textbooks.

But is all well and good? As with everything, there are always ‘buts and maybes’.

However, the problem isn’t that AI invalidates learning; the problem isn’t AI at all. It’s us.

The Key: It’s not About the Tool, it's How You Use It

I would be lying to you if I told you there isn’t evidence of the issues AI can present when clumsily introduced to children without proper guidance or monitoring.

While some studies and real-life cases like Alpha School show dramatic learning improvements, when AI-based learning was tried in places like Turkey and the Netherlands, the results weren’t as satisfactory: all that Large Language Models (LLMs) such as ChatGPT did was make students totally dependent on them.

However, this is a common-sense exercise: if you give a human a tool that allows them to outsource effort, without teaching them how to use it properly, what do you expect to happen?

Particularly insightful is MIT’s recent viral study on the effects of using AI to write essays, which I covered in this newsletter recently.

Students were given a writing task on a topic of their choice (from a curated selection) and divided into three cohorts: no tools, Google search, and LLMs. This had several interesting outcomes:

  1. Brain-only students had the best essay scores and, crucially, higher cognitive effort (they measured brain activity using EEG).

  2. LLM-enhanced students exhibited a very similar writing style and poorer performance, indicating that they were outsourcing both the writing and the thinking to the model.

  3. When LLM-enhanced students were later tasked with writing the essay without AI, they performed significantly worse than brain-only students, indicating that an AI-first approach can make you an overall inferior writer, even without AI.

  4. Conversely, brain-first students who then used LLMs showed improvement in writing, suggesting that, when used as a learning and support tool, LLMs make you a better writer.

What this study is screaming at the top of its lungs is that AI isn’t the problem; it’s how you use it.

A Guide to Effective AI Use & A Rant on Modern Education

If you’re interested in improving your AI skills or want to introduce AI to kids or family without backfiring, the following recommendations may be of help.

My goal is to share with you, in detail, what I believe is the future of education and work, as well as how I would approach introducing AI to someone for the first time.

The task familiarity vs task complexity conundrum

The entire education system, or the way we currently measure human progress in education, is absurdly broken.

For centuries, as knowledge was not a commodity, those who amassed knowledge were extremely valuable to society; they were the door to answers, and answers led to money.

However, if there’s one thing AI is commoditizing, it’s access to knowledge; the supply curve of knowledge has just experienced a massive outward shift, which naturally and severely drops the value of knowledge-based expertise.
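The supply-shock logic above can be sketched with a toy linear supply-and-demand model. All the numbers below are illustrative assumptions, not data from any study; the point is simply that shifting the supply curve outward while demand stays fixed lowers the equilibrium price:

```python
# Toy linear supply/demand model illustrating the "supply shock" argument:
# when the supply of knowledge expands (AI), its equilibrium price falls.
# All parameter values are illustrative assumptions, not real data.

def equilibrium(a, b, c, d):
    """Demand: P = a - b*Q. Supply: P = c + d*Q. Returns (Q*, P*)."""
    q = (a - c) / (b + d)   # quantity where demand price equals supply price
    return q, a - b * q

# Before AI: knowledge is scarce (high supply intercept c).
q0, p0 = equilibrium(a=100, b=1.0, c=40, d=1.0)
# After AI: the supply curve shifts out (c drops), demand unchanged.
q1, p1 = equilibrium(a=100, b=1.0, c=5, d=1.0)

print(f"price before: {p0:.1f}, after: {p1:.1f}")
# More knowledge is "traded", but at a lower price:
assert q1 > q0 and p1 < p0
```

A positive supply shock moves the intersection down along the demand curve: quantity rises, price falls, which is exactly what commoditization means for knowledge-based expertise.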

I’m talking about shallow knowledge; there’s still value in those who can push knowledge further and offer new opinions on topics. However, most knowledge work and education isn’t about profound knowledge; put simply, knowing the civil laws of the state of New York by heart is no longer as valuable.

The issue is that our education system remains entrenched in this paradigm. We force our kids to memorize stuff and grade their progress based on that.

Instead, we should assume the presence of ChatGPT on that kid’s computer and, rather than grading what they know, grade them for what they do with what they know.

It’s about high agency: what you do with the information will determine your success, as well as that of your children.

But how do we measure that? How can we modify grading systems to facilitate this change?

And the answer lies in the task familiarity vs. task complexity conundrum, which, incidentally, is also one of the significant problems with modern AI evaluation; even our AI models are being evaluated in the wrong way.
