Owning Knowledge and AI

It is July 2025. “The MIT Article” is all anyone is talking about. This is the article on arXiv.org in which researchers compared essays written by those using ChatGPT, a web search engine, or only their brains. It is a long and interesting preprint. The article is surely of interest to educators, surely an argument for being very cautious about using LLMs in teaching, and surely an article with very limited findings.

I am hoping readers are familiar with the study, but here is a quick summary for those who are not: Fifty-four adults each wrote three essays, either with ChatGPT, with a search engine, or with their brains only. In a fourth session, those who had used no technology for the first three essays wrote another essay (on a topic they had previously written about) using ChatGPT, and those who had first written with the LLM wrote a fourth essay with their brains only; the search engine group wrote only the first three essays. Humans and AI evaluated the essays, and the brain waves of all writers were measured. The brain data make up the bulk of the results, filling many pages of graphs (which, by the way, are fascinating reading!).

I’ve been reading and writing a lot about the study, and I am finalizing the script for a podcast episode on the article. As I have interacted with the ideas in the article, including its questions, methods, data and analysis, and conclusions, it has struck me that the data given the least space in the paper are the most important.

The essay writers were interviewed, briefly, after writing. One of the questions asked about their sense of ownership of the content of their essays. The bulk of the data in the article focuses on the evaluation of the essays and the analysis of the brain waves, but it has occurred to me that a sense of ownership is the most important thing we can give our students.

In school, we typically develop knowledge and learn skills that are intended for transfer; although we learn and are evaluated in school, our knowledge is useful only when we can apply it in situations beyond school. To be able to transfer knowledge, we must be competent with it and we must own it.

Competence is easy to understand: we know what we need to know and we can do what we need to do. It is a brain activity, but it is also affected by our environments. Our competence grows as we come to know more and as we become more able to gather information from sources in our environment.

When we own knowledge, it has been integrated into our thinking; we literally perceive the world differently after we have integrated it. We become critical of our knowledge when we own it. When we are critical of our knowledge, we can decide whether it is sufficient to solve the problem in front of us. We can also generate additional knowledge from the knowledge we own.

Viewed through the lens of ownership, generative AI is no different from any other instructional tool or method. If students leave your class without a sense of ownership of the curriculum, then they have not learned what they needed to learn. Generative AI can be used to create essays (or other artifacts of learning) in which learners lack ownership, but so can more traditional digital information sources, and so can the brain working alone.