Personalised learning: Edtech’s long-standing aspiration

Published on April 27, 2023, by Chris Rainville

Imagine a world in which every student receives a personalised education, as if each had a one-to-one relationship with a highly skilled tutor. A world in which the pace, sequence and approach to learning are unique to every student.

Previously, achieving this would have required a teacher-to-student ratio approaching 1-to-1 and exceptional pedagogues: an impossible aspiration given shrinking real education budgets, difficulties retaining teachers, and growing class sizes in the UK and every other high-income nation (1, 2). Given recent breakthroughs in large language models (LLMs) and generative AI, we believe the marginal cost of delivering this experience now rides down the curve of Moore's Law rather than being tied to scarce (and far more expensive) educators. Edtech's promise of a high-quality, personalised learning experience, and outcome, is no longer a dream; it can now be fulfilled.

Why now?

We have been to this movie before. Many Edtech companies have tried to offer personalised learning over recent decades and failed. In 2007, Simon was a founding director, alongside Josh Kopelman, of Knewton, which went on to raise over $180M to pursue this vision but never reached the promised land. We believe LLMs are a key missing ingredient; here's why:

→ 1. LLMs provide a new set of building blocks that cheaply and effectively overcome the limitations of past personalised learning attempts

One of the main innovations of GPT-type models is that a task can be asked of the model with no examples of the desired output ("zero-shot") or only a handful of examples ("few-shot"). Not having to fine-tune on a large corpus of examples reduces the upfront investment required to make a model useful and widens the range of possible applications of AI; a minimal sketch of the difference follows below.

Credit: Dr. Sik-Ho Tsang for his evaluation of the now-surpassed GPT-3
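As a concrete illustration of zero- versus few-shot prompting, here is a minimal sketch using the pre-1.0 openai Python package (later versions use a different client interface); the model name and the quiz-writing task are our own illustrative choices, not taken from any particular product.

```python
# Minimal sketch: zero-shot vs. few-shot prompting for an educational task.
# Assumes the pre-1.0 `openai` package and an OPENAI_API_KEY in the environment;
# the model name and the example task are illustrative assumptions.
import openai

def ask(messages):
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",   # any capable chat model would do
        messages=messages,
        temperature=0.3,
    )
    return response.choices[0].message.content

# Zero-shot: no examples of the desired output, just an instruction.
zero_shot = ask([
    {"role": "user",
     "content": "Write one multiple-choice question (4 options, mark the answer) "
                "testing GCSE-level understanding of photosynthesis."}
])

# Few-shot: the same instruction plus a worked example showing the desired format.
few_shot = ask([
    {"role": "user",
     "content": "Q: What is the powerhouse of the cell?\n"
                "A) Nucleus  B) Mitochondrion*  C) Ribosome  D) Golgi body\n\n"
                "Write one more question in exactly this format, "
                "testing GCSE-level understanding of photosynthesis."}
])

print(zero_shot)
print(few_shot)
```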

Modern LLMs enable three key building blocks for next-generation Edtech products, including the ability to:

  • effectively summarise long content (see, for example, OpenAI's book summarisation model); a minimal sketch of this follows the list
  • rapidly and cheaply create new copy/content from simple prompts
  • build a smooth, human-like AI interface by pairing synthetic AI video with the "human-like" dialogue capability of LLMs (in time, adapting the mix to each individual user)
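To illustrate the first building block, here is a minimal sketch of recursive summarisation (chunk the material, summarise each chunk, then summarise the summaries), in the spirit of OpenAI's book-summarisation work. The `call_llm` parameter stands in for any chat-completion call (for example, a thin wrapper like the `ask` helper above), and the chunk size is an arbitrary assumption.

```python
# Minimal sketch of recursive summarisation: split long content into chunks,
# summarise each chunk, then summarise the concatenated summaries.
# `call_llm` is a stand-in for any chat-completion call; the chunk size is illustrative.
from typing import Callable, List

def chunk_text(text: str, max_chars: int = 4000) -> List[str]:
    """Naive fixed-size chunking; a real system would split on section boundaries."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarise(text: str, call_llm: Callable[[str], str], max_chars: int = 4000) -> str:
    """Recursively reduce `text` until it fits in one prompt, then summarise it."""
    if len(text) <= max_chars:
        return call_llm(
            "Summarise the following study material into concise revision notes:\n\n" + text
        )
    partial = [summarise(chunk, call_llm, max_chars) for chunk in chunk_text(text, max_chars)]
    return summarise("\n\n".join(partial), call_llm, max_chars)
```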

→ 2. Previous attempts were limited by content creation expenses

Previous attempts at AI-based personalised education were limited by the cost of creating educational content, forcing companies to partner with existing content owners or to depend on users to train their models. Knewton, mentioned above, built its own content in v1 as a proof of concept; to scale in v2, it initially tried to partner with content providers such as Pearson to offer a "platform-as-a-service". However, those partnerships did not always deliver as planned. Knewton also tried going direct to educators: teachers were asked to feed content into the product to adapt it to their syllabus, and to extend the product beyond areas well covered by open-access content, the "algorithm" had to be trained by the educators themselves. Despite raising $180M, the system never reached a critical mass of content or fully delivered on its founder's vision, a vision that was simply ahead of its time. We believe that time has now come.

→ 3. Computers, tablets and smartphones are now widely available in most schools in developed countries

Following the recent pandemic, computers and tablets are also now widespread in households across most developed countries and across socioeconomic backgrounds. In the UK, 92% of students had access to educational apps in 2020, and the government distributed ~2M tablets during this period, against a student population of ~10M. In the US, ~86% of schools provided tablets during the pandemic vs. ~43% pre-pandemic. With tablets, mobile phones and computers already in the hands of students, the market for Edtech is more addressable than ever.

Opportunities for founders and design considerations

The single biggest opportunity for AI in Edtech is to augment an overwhelmed base of teachers with a fully fledged AI tutor. We believe delivering a scalable, high-quality education would require stitching together a chat or voice interface to create a "human-feeling" interaction, an LLM to create bespoke educational content (e.g. study materials, quizzes, study notes), and a personalisation model to dynamically serve this content to the learner; a minimal sketch of this architecture follows below. Additional low-hanging-fruit businesses we believe will be built include spoken-language practice bots to challenge existing "rent a native speaker" marketplace models, AI-generated study materials, and a content-creation assistant optimised to reduce the cost of creating courses.
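To make the shape of such a system concrete, here is a minimal sketch of how the three pieces might be stitched together. Every class and function name here is hypothetical, and the personalisation logic is deliberately naive; `call_llm` again stands in for any chat-completion call.

```python
# Minimal sketch of an AI-tutor loop stitching together the three pieces described
# above: a personalisation model choosing what to study next, an LLM generating
# bespoke content, and a chat interface delivering it. All names are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class LearnerProfile:
    """Tracks per-topic mastery as a score between 0 (new) and 1 (mastered)."""
    mastery: Dict[str, float] = field(default_factory=dict)

    def weakest_topic(self, syllabus: List[str]) -> str:
        # Naive personalisation: always serve the least-mastered topic.
        return min(syllabus, key=lambda t: self.mastery.get(t, 0.0))

    def record_result(self, topic: str, correct: bool) -> None:
        current = self.mastery.get(topic, 0.0)
        self.mastery[topic] = min(1.0, current + 0.2) if correct else max(0.0, current - 0.1)

def tutor_turn(profile: LearnerProfile, syllabus: List[str],
               call_llm: Callable[[str], str]) -> str:
    """One tutor turn: pick a topic, generate a bespoke exercise, return it to the chat UI."""
    topic = profile.weakest_topic(syllabus)
    prompt = (
        f"You are a friendly tutor. The learner's mastery of '{topic}' is "
        f"{profile.mastery.get(topic, 0.0):.1f} out of 1. Write a short explanation "
        f"followed by one practice question pitched at that level."
    )
    return call_llm(prompt)
```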

To build a successful startup in this space, we believe founders will need to focus on three key design parameters: content comprehensiveness/quality, a D2C focus, and customer acquisition (likely to be freemium). Efficient distribution is essential for scaling Edtech companies: right now, the most scalable acquisition channels we've seen are viral (e.g. Kahoot), and if you have to pay, we recommend influencer marketing on segment-specific platforms (e.g. TikTok or Snapchat). In terms of focus, we strongly prefer companies selling D2C over those trying to sell to schools/universities/governments, because sales velocity is higher. To ensure high-quality content, startups should prioritise fine-tuning their models on verified educational content, acquired either through content partnerships or by building their own content moat through user-generated content (UGC). For the latter, we believe a reputation system supplemented with AI models could be the most efficient way to ensure a high-quality experience; one possible shape for that combination is sketched below.
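As a purely illustrative sketch of that combination (the weighting, threshold and prompt are arbitrary assumptions, not a recommended policy), an uploaded item could be gated by blending the uploader's reputation with an LLM-based quality check:

```python
# Minimal sketch of gating user-generated content (UGC) by combining an uploader
# reputation score with an LLM-based quality review. The weighting, threshold and
# prompt are arbitrary illustrative assumptions.
from typing import Callable

def llm_quality_score(content: str, call_llm: Callable[[str], str]) -> float:
    """Ask the model to rate the content 0-10 for accuracy and clarity; fall back to 0 if unparsable."""
    reply = call_llm(
        "Rate the following study material from 0 to 10 for factual accuracy and clarity. "
        "Reply with a single number only.\n\n" + content
    )
    try:
        return max(0.0, min(10.0, float(reply.strip()))) / 10.0
    except ValueError:
        return 0.0

def should_publish(content: str, uploader_reputation: float,
                   call_llm: Callable[[str], str],
                   reputation_weight: float = 0.4, threshold: float = 0.6) -> bool:
    """Blend uploader reputation (0-1) with the model's score and publish above a threshold."""
    blended = (reputation_weight * uploader_reputation
               + (1 - reputation_weight) * llm_quality_score(content, call_llm))
    return blended >= threshold
```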

We have met numerous startups in this space over the last few months and would like to invest in several of them in the months ahead. So if you're building an Edtech company employing LLMs or generative AI, we'd love to hear from you.

If you think we’ve missed something in this post or would like to join the conversation, we’d welcome your input on Twitter.