What If Every Teacher Could Build an AI Tutor? David Wiley’s Generative Textbooks Idea Is Worth Your Attention


There’s a particular kind of idea that shows up in education technology every few years — one that sounds almost too obvious once you hear it, but that nobody had quite put together that way before. David Wiley’s work on generative textbooks is one such idea.

I’ve been following Wiley for a long time. If you’ve ever used an open textbook in a course or benefited from freely available educational materials online, there’s a good chance his fingerprints are on the infrastructure that made that possible. He’s one of the founders of the open educational resources movement — the effort to create, share, and freely adapt teaching and learning materials under open licenses. It’s unglamorous, important work that has saved students billions of dollars in textbook costs and given teachers genuine tools they can actually modify.

So when Wiley started applying that same philosophy to AI, I paid attention.


The Problem He’s Solving

The standard AI-in-education conversation goes like this: here are some tools (ChatGPT, Gemini, Claude, take your pick), and here are some ways teachers can use them. The tools belong to the companies. The teachers are users. If the company changes pricing, changes policy, or shuts down, the teacher starts over.

Wiley’s question is different: what if the instructional logic — the pedagogical intelligence built into an AI learning experience — belonged to the teacher? What if any educator could author an AI-powered learning tool without writing code, without a budget, and without surrendering control to a platform?

That’s what generative textbooks are attempting to answer.


How It Actually Works

The architecture is simpler than it sounds. A generative textbook isn’t a document — it’s a structured collection of inputs that, when assembled, tell an AI model exactly how to behave as a learning tool for a specific subject.

Here’s what an author creates:

  • A book-level prompt stub — the template that sets the AI’s voice, tone, format, and overall behavior. Think of this as the personality and ground rules of the learning experience.
  • Learning objectives — one per chapter or topic, short statements about what a learner should understand or be able to do.
  • Topic summaries — accurate, context-rich summaries written for the AI, not for students. These are what the model uses to stay grounded in accurate content rather than hallucinating.
  • Activity templates — the types of interactions available: flashcards, explanations, quiz questions, Socratic dialogue, whatever the author builds in.

When a student picks a topic and an activity type, the system assembles the relevant pieces into a single prompt and sends it to the language model, which generates a fresh, tailored learning experience — not retrieved from a database, but generated in the moment based on the author’s pedagogical structure.
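The authoring structure and the assembly step can be sketched in a few lines of code. This is a minimal illustration of the idea, not Wiley’s actual implementation — all field names, the assembly order, and the sample content are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class GenerativeTextbook:
    # The four author-created pieces described above. Names are illustrative.
    prompt_stub: str                    # book-level voice, tone, and ground rules
    objectives: dict[str, str]          # topic -> learning objective
    summaries: dict[str, str]           # topic -> AI-facing grounding summary
    activities: dict[str, str] = field(default_factory=dict)  # activity -> instructions

    def assemble_prompt(self, topic: str, activity: str) -> str:
        """Combine the author's pieces into a single prompt for the model."""
        return "\n\n".join([
            self.prompt_stub,
            f"Learning objective: {self.objectives[topic]}",
            f"Grounding summary:\n{self.summaries[topic]}",
            f"Activity: {self.activities[activity]}",
        ])

# Hypothetical sample content, invented for the sketch.
book = GenerativeTextbook(
    prompt_stub="You are a patient, encouraging tutor. Keep responses short.",
    objectives={"photosynthesis": "Explain how plants convert light into chemical energy."},
    summaries={"photosynthesis": "Photosynthesis occurs in chloroplasts, where chlorophyll..."},
    activities={"flashcards": "Generate five question/answer flashcards on this topic."},
)

prompt = book.assemble_prompt("photosynthesis", "flashcards")
```

Notice that nothing here is code a teacher would write — the authoring work lives entirely in the strings, which is exactly the “prompt engineering is instructional design” point.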

As Wiley puts it: in this model, prompt engineering is instructional design. The authoring isn’t code — it’s curriculum work. That’s a meaningful distinction for teachers.


The Clever Pivot on Cost

The original prototype sent prompts through an API to open-weight language models hosted on Groq. Clean, seamless, technically elegant. Also not free — API calls cost money at scale, and Wiley found that most educators he consulted weren’t particularly concerned with whether the underlying model was “open” in the ideological sense. They were concerned with whether it was free for students.

So he made a pragmatic call: rather than routing prompts through a back-end service, the tool now assembles the prompt and copies it to the student’s clipboard. The student pastes it into whatever AI interface they already have access to — ChatGPT’s free tier, Gemini, a school-licensed model, whatever.
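The hand-off step is simple enough to sketch. This version assumes the third-party `pyperclip` package for clipboard access (an assumption — the actual tool runs in the browser) and falls back to printing the prompt when no clipboard is available:

```python
def hand_off_prompt(prompt: str) -> str:
    """Copy the assembled prompt so the student can paste it into
    whatever AI interface they already have access to."""
    try:
        import pyperclip  # third-party package; an assumption for this sketch
        pyperclip.copy(prompt)
        return "copied"
    except Exception:
        # No clipboard available (e.g. headless environment):
        # show the prompt for manual copying instead.
        print(prompt)
        return "printed"
```

The fallback path is the whole design philosophy in miniature: when the elegant route isn’t available, degrade to something the student can still use.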

This is inelegant in the user-experience sense. There’s a copy-paste step that breaks the flow. Analytics become difficult. Student privacy depends on whatever tool they choose to use. Wiley is honest about all of this — he describes the project explicitly as a tech demonstration, not a finished product.

But there’s something worth noticing in the pragmatism. The decision prioritizes actual access over technical elegance. For students in districts that can’t afford platform licenses and teachers who don’t control their school’s technology budget, a tool that works with the free tier of a consumer AI product is more useful than a seamless experience behind a paywall.


Where Wiley Has Taken This Since

The generative textbook prototype was a starting point, and Wiley has kept building. His more recent thinking has evolved toward what he calls OELMs — Open Educational Language Models — a framework that combines open-licensed content with AI in a more sophisticated way.

The key addition is retrieval-augmented generation (RAG): rather than just grounding the AI’s behavior in a few paragraph-length topic summaries, an OELM includes a curated collection of OER content that the model actively retrieves from when generating responses. This makes the outputs more accurate, more traceable to specific source materials, and more trustworthy for educational use. That addresses one of the genuine limitations of relying on a general-purpose language model, which might confabulate confidently.

The broader argument Wiley is making — that generative AI is the logical successor to OER — is worth sitting with. His claim isn’t that AI replaces open textbooks, but that the principles that made OER valuable (open licensing, participatory creation, the ability to adapt and remix) need to be extended into the AI space. As the educational materials market shifts toward AI-powered products, the question of who owns the instructional logic matters enormously for equity and access.


What This Means for Teachers

I want to be careful not to oversell where this project currently is. The generative textbooks site is live and explorable, but this is genuinely early-stage work. The copy-paste workflow has real friction. The quality of the learning experience depends heavily on the quality of the inputs a teacher creates, which means the authoring itself requires genuine pedagogical thought — garbage in, garbage out applies acutely here.

But the underlying question Wiley is raising is one I think about a lot as an instructional coach: who gets to design the learning experience, and on whose terms?

The dominant model in AI-powered education right now is platform-centric. A company builds an AI tool, schools license it, teachers become users. This mirrors exactly what happened with traditional educational technology — districts buy the LMS, teachers work inside it, the pedagogical architecture belongs to the vendor. We know how that story tends to go: cost escalation, lock-in, tools that don’t quite fit what teachers actually need because they were designed generically.

Wiley’s generative textbooks project is asking whether there’s another path — one where educators are architects rather than users. Where the instructional intelligence lives in open, adaptable, teacher-created structures rather than in proprietary platforms. Where a teacher in a school with limited resources can build a learning tool that’s as good as anything a well-funded district is paying for.

That’s not a modest ambition. And it’s not finished yet. But it’s the kind of work that tends to matter more than it seems to when it starts.




Related reading: my AI books post covers Ethan Mollick’s Co-Intelligence, which has useful framing for educators thinking about AI as a co-teacher rather than a replacement — a theme that runs directly through Wiley’s work.

You Might Be Trying to Replace the Wrong People with AI

I was at a leadership group and people were telling me “We think that with AI we can replace all of our junior people in our company.” I was like, “That’s the dumbest thing I’ve ever heard. They’re probably the least expensive employees you have, they’re the most leaned into your AI tools, and how’s that going to work when you go 10 years in the future and you have no one that has built up or learned anything?”

So says Matt Garman, CEO of Amazon Web Services. A better question to ask: What do you mean, you don’t want to teach your high school students how to use AI to help them write code and solve problems more efficiently?

We live in weird times when people constantly retreat to what came before and resist moving forward.

Life is the future, not the past.



The Eclectic Educator is a free resource for everyone passionate about education and creativity. If you enjoy the content and want to support the newsletter, consider becoming a paid subscriber. Your support helps keep the insights and inspiration coming!