The Safety Demonstration
On AI, Trust, and the Norms We Build
I wrote the first version of this on a flight back to Sacramento from Orange County, somewhere between the safety demonstration and the moment they told us to put our tray tables up and seat backs forward for landing.
I mean that literally.
If you had been one of the flight attendants on that flight, you might have seen me mid-demonstration suddenly smack my forehead, mutter “holy shit,” yank out my laptop, and start hammering at the keys while you were still pointing to the exits. For a guy like me, that is close enough to an epiphany.
The thought did not come out of nowhere.
I had just spent three days at the Academic Senate for California Community Colleges Institute talking with faculty, administrators, and educators about AI, assessment, trust, authorship, labor, and what it means to teach in a world where language itself can now be generated on demand. I presented in two sessions of my own, both circling questions I cannot seem to leave alone: What happens when the artifact fails? What happens when trust in assessment begins to erode? What kinds of human judgment still matter when the thing in front of us no longer tells the whole story?
By the time the conference ended a little after noon, I was in that familiar post-conference state: exhausted in body, overstimulated in mind, still turning ideas over long after the event itself was done.
My flight was not until six, so I called my homie Gabe, who has lived in Orange County for about twenty-five years. I have known him since we were fifteen. I told him I had a few hours to kill.
“Costa Mesa is 30 minutes away,” he said.
“I’ll be there in ten.”
That’s Gabe.
There are people in your life who know how to meet a moment like that, and Gabe is one of them. After three days of presentations, networking, faculty talk, and professional performance, sometimes what you need is not one more esoteric conversation about AI. You need somebody who knew you before all of that. Somebody whose life took a different route but who built something real.
Gabe is one of the most successful people I have ever met, and what I love about his story is that it does not begin with prestige. It begins with a whim and lots of hard work. As a teenager, he walked into a new golf course in our hometown, asked for an application, and got hired on the spot. He picked the range, washed balls, parked carts, booked tee times, and learned the place from the ground up. He also took full advantage of the free golf. That part matters.
The rest of us got pulled in with him. We were blue-collar kids, sons of working men, out at Timber Creek whenever we could. Gabe taught us all how to play. For him, though, golf was never just recreation. It became craft, discipline, identity, a life.
He did not go the traditional college route, and I know class still leaves its mark there, because that is how class works in America. But he learned the game, the business, and the culture from the inside out. He earned his PGA card. Now, nearly thirty years later, he is the Director of Golf at Dove Canyon Golf Club, which is a hell of a long way from washing range balls as a kid.
Maybe that is part of why I needed to see him that day. Gabe reminds me that expertise does not always arrive in institutional language. Sometimes it comes from repetition, labor, and staying with a craft long enough that its rules become part of your body.
So yes, we drank beers. We talked shit. We decompressed. But by the time I was back on the plane half-listening to the safety demonstration, something clicked.
Every time we board a plane, a group of strangers stands in front of us and demonstrates how not to die.
That is what the safety speech is. Here is how to fasten the belt. Here is where the exits are. Here is how to brace. Here is what to do if the cabin loses pressure. Here is how to turn your seat cushion into a flotation device if things go very badly very fast.
And we accept all of it.
More than that, we barely notice that we accept it.
That is the remarkable part. Not that the ritual exists, but that it feels ordinary. A crew of people most of us have never met is rehearsing catastrophe in front of us while we sit in a metal tube ready to fly thousands of feet above the earth, and most of the cabin is half-checking email, opening snacks, or staring at a phone. The ritual has become so normal it no longer feels dramatic. It just feels like reality.
That, to me, is the power of a norm. It no longer has to persuade you every time. It gets into the body. It teaches you how to move.
Tray table up. Seat back straight. Phone in airplane mode. Seatbelt low and tight. Wait until the captain turns off the sign.
Nobody on that plane is having a philosophical debate about compliance. The system has already done its work. The rules feel real because the stakes feel real.
That matters, because it tells us something higher education has not fully reckoned with in the age of AI. Human beings are not uniformly anti-rule. When danger is legible enough, when a system is mature enough, and when expectations are repeated enough, people are pretty willing to live inside norms.
That is true of aviation. It is true of medicine, building codes, food safety, golf, and every other domain where people learned, often through error and consequence, that improvisation alone is not enough. We needed systems. We needed protocols. We needed habits strong enough to survive mood, ego, fatigue, and wishful thinking.
That is what makes AI in education so difficult to navigate. The stakes are real, but they are not yet collectively felt in the same way. The harms are diffuse. The benefits are uneven. There is no wreckage, no black box, no universally recognized image of failure. So instead of a mature culture, we have an unstable one: some institutions ban, some adopt, some panic, some sell, and most are still speaking in slogans. Slogans are a terrible substitute for culture.
The more I think about it, the more I think what we are living through is not just a technology shift. It is a norming crisis.
That feels especially true in writing instruction, where so much of what we do has long depended on quiet faith in the artifact. A student submits a paper. The paper becomes evidence. From that evidence, we infer thought, growth, judgment, rhetorical awareness, effort, maybe even readiness. We build rubrics, portfolios, placement systems, and entire assessment cultures around those inferences.
Composition, of all fields, should recognize the instability here.
We have our own rituals of norming. We have our own ways of turning expert agreement into institutional confidence. We have a long history of inferring cognition from products and treating those inferences as stable enough to organize real student lives.
AI exposes how fragile that confidence always was.
Because the artifact was never the whole story. It only looked that way because so much invisible labor sat quietly behind it: planning, initiating, organizing, sustaining attention, managing time, sequencing steps, revising, tolerating frustration, making choices, translating a foggy intuition into coherent language.
That is the elephant in the room.
We spend enormous energy talking about the product while leaving the deeper assumptions mostly untouched. We argue over the ant on the elephant’s toe while an entire cognitive infrastructure sits underneath the enterprise.
We grade the product, but we sort people by the invisible cognitive infrastructure required to produce it.
That sentence feels truer every year since genAI arrived. Not because AI destroys thinking, and not because every student text is now suspect. The problem is more unsettling than that. AI alters the relationship between process and product in ways that force older assumptions into view. The page is still there. The artifact still exists. But the old confidence that the finished thing gives us a reliable window into the human process behind it has started to wobble.
Once that wobble begins, the real questions rush in. What exactly were we trusting all this time? What counts as evidence of learning? What counts as support? What counts as authorship? What counts as revision? What counts as ethical assistance? What counts as a human voice when language itself is reproducible on demand?
Those are not merely technical questions. They are human, cultural, and institutional questions. They are questions about values and about what kinds of labor we are willing to see.
That is why I keep circling back to rules.
Not rules in the sterile bureaucratic sense. Rules in the deeper human sense. Rules as collective attempts to make risk manageable. Rules as rehearsed expressions of value. Rules as the thing communities build when they know enough to say: this matters, pay attention, here is how we move through this without pretending nothing has changed.
Right now, higher education is in the dangerous middle. The technology is already here, already ambient, already entering workflows, habits, and expectations, while the human conversation around it remains shallow, split, reactive, and underdeveloped. The danger is not spectacular. It is the slow normalization of habits, dependencies, and assessment assumptions we have not yet adequately examined.
To be fair, none of this is entirely new to composition. Writing studies has wrestled for decades with process and product, with assessment validity, with the instability of the artifact itself. AI did not invent those tensions. It intensified them.
That is why faculty matter so much right now.
Whether we like it or not, norms are already being built. They are being built by repetition, by software defaults, by convenience, by student habit, by administrator pressure, by vendor design, by fatigue, by quiet accommodation, by people simply trying to get through the week.
The only real question is whether faculty are going to help shape those norms while that shaping is still possible.
I am not arguing that education should imitate aviation literally. Classrooms are not cockpits. Students are not passengers. Learning is not an emergency procedure. But faculty live closest to the actual human questions this technology keeps forcing into the open. We are the ones who deal with process, voice, revision, struggle, scaffolding, misunderstanding, growth, avoidance, courage, performance, and language. We are the ones who know that humans do not simply produce artifacts out of nowhere.
Maybe that is why my visit with Gabe stayed with me on the flight too.
Earlier that afternoon I had been sitting with someone who built a life through repetition, standards, labor, and practiced feel, and now I was watching another set of professionals move through a ritual so ingrained most people barely noticed it. Different worlds, same truth: mastery settles into the body before it ever settles into language.
Maybe that is what had been building all day without my fully realizing it. Three days at ASCCC talking about AI, trust, assessment, and the future of teaching. A few afternoon beers with one of my oldest friends, whose whole life is a case for earned mastery. Then a flight crew calmly walking a plane full of people through the oldest lesson in any high-stakes system: rules only matter if a culture exists to make them real.
That is the work in front of us now.
Not deciding once and for all whether AI is good or bad. That question is too blunt to hold what is happening. The harder work is cultural work. It is the work of making visible what is on its way to becoming invisible. It is the work of naming what kinds of help remain helpful, what kinds of outsourcing alter the task itself, what kinds of disclosure matter, what kinds of assessment still give us meaningful glimpses of learning, and what kinds of habits we want students to build while all of this is still in motion.
By the time the plane leveled off and the captain was announcing beverage service, the safety demonstration was long over. Most people had tuned it out before it finished, but the ritual had done its work anyway. That is what successful norms do. They become ordinary enough to disappear.
Gabe knows this. The flight attendants know this. Higher education needs to learn it again.
There is still time, maybe not much, but still time, to decide what kinds of norms we want to build before they settle into the body and start calling themselves common sense.