Stop Hiring Humans: Wait…WHAT?
On my way from the airport to the hotel in San Francisco, I passed a billboard that read, “Stop Hiring Humans.” It was an advertisement for an AI platform. I actually laughed out loud. Not because it was funny, but because it was so eerie. You see, I was on my way to the Learning and the Brain Conference, and this year's title was Teaching Generation AI-Z: Advancing learning in the age of AI, distractions, and uncertainty. And there it was. A bold public declaration that humans are somehow optional in industry.
Memory has always fascinated me. Some ideas dissipate in seconds, while others stay with us for decades. My mentor, Dr. Kieran O’Mahony, once told me that if I wanted to solve a problem, I should tell myself before bed that I would have the answer in the morning. I have tried it many times. It works. The insight often sneaks in as I wake up, but I have to act on it immediately, or the idea quickly fades. You see, the real challenge is not generating the idea. It is holding on to it. Writing it down. Turning it over in different contexts (I call this my gym thoughts or shower thoughts). Retrieving it later without notes. Connecting it in ways that make the learning transferable to future situations.
That rhythm of insight, effort, and retrieval has shaped how I consider my own learning and, therefore, my instructional design for the students and educators I serve. I care less about short-term performance and more about what lingers and how to make it stick. I often consider what will endure after the grade is posted, the students move to the next course, or the Neural Education Institute concludes. In a world where a billboard can suggest that humans are replaceable, this cognitive work seems urgent.
This was not my first time at a Learning and the Brain conference, and I am grateful to return because of the connections I am making. At last year's conference in Boston, the theme was Teaching emotional brains: Strategies for student behavior, resilience, regulation, trauma, and emotional intelligence in challenging times. The conversations were rich and, at times, intense. I listened as researchers debated whether the Ebbinghaus Forgetting Curve was overstated or misunderstood. Some labeled it a neuromyth. Others defended it as essential for understanding retrieval practice. The arguments were thoughtful and very human.
But while the researchers were exploring the past, my mind kept moving toward the future. Generative AI was rapidly entering classrooms, lesson plans, and homework assignments. Students were beginning to outsource parts of their thinking with a few keystrokes. I found myself less interested in defending a curve drawn in the 1800s and more concerned with a modern question pressing on all of us. What happens to memory when students (and educators) no longer have to do the heavy cognitive lifting themselves?
By the time I arrived at this year’s conference in San Francisco, the question no longer felt theoretical. It felt urgent. As I walked up the hill toward the Fairmont, a Waymo car drove past me. No driver. No hands on the wheel. No hired human. The billboard I had seen on the way from the airport stopped feeling like creepy science fiction. The future of AI was not being explored only inside the conference. It was moving through traffic, obeying signals, and making turns.
At the conference, I attended a session by Dr. Rebecca Winthrop from the Center for Universal Education at the Brookings Institution. Her team conducted a yearlong premortem study on generative AI in schools. They interviewed hundreds of students, educators, and families, and reviewed hundreds of peer-reviewed studies. Their early findings were sobering: the risks of using generative AI in children's education outweigh its benefits.
Suddenly, the debate about forgetting curves felt less central. If durable learning requires effort, meaning, and retrieval, then how we use AI becomes an extraordinary design opportunity. This is not simply a technology issue. It is a memory and brain plasticity issue. It is a profoundly human issue.
One line from Dr. Rebecca Winthrop’s session stayed with me. She said, “It is not too late. The future is still something we get to shape.”
A premortem imagines potential failure so we can prevent it. I left that session realizing that the responsibility rests with us, as educators. It lives in our instructional design choices, in what we ask students to wrestle with, in how we space and revisit ideas, and in how intentionally we build in the very human skills of collaboration, creativity, communication, and critical thinking.
My philosophy is shifting. I am no longer asking whether students should use AI. They already are. I am asking how we guide them to use it in ways that strengthen memory, deepen their humanity, and expand their sense of what is possible in their future careers.
The billboard declared that humans are optional. The educators, students, and parents in the Brookings Center for Universal Education study suggest otherwise. As educators, we design learning experiences that move fleeting thoughts into lasting neural architecture. In a world that will demand AI fluency, we have both the responsibility and the opportunity to cultivate classrooms where humans collaborate, think critically, and create with care. For these are the 21st-century skills that make humans not optional, but essential.