Thanks, Nathan, for this nuanced take on writing and AI. It resonates deeply with my own year-long experiment co-writing a non-fiction newsletter about AI history with AI itself.
Your observation about finding your voice particularly strikes me. You write about how AI "won't make you read fragments" and tends toward those "long, lulling, lofty sentences." I think you've identified something crucial. One (over-simplified) way I've come to think about this is that AI seems to operate primarily in what psychiatrist Iain McGilchrist calls "left-brain" mode: explicit, categorical, disembodied. It struggles with the embodied, implicit, unique qualities that define authentic voice. (See Henry Shukman's book Original Love for a wonderful description of this.)
I share your concern about AI creating higher barriers for beginning writers. But I'd add another worry: even for experienced writers, I believe there's something irreplaceable about what happens in our brains when we extract those "very personal waves of creative expression" onto the page. Lewis Hyde argues in The Gift that creative work is fundamentally different from commodity exchange: it's a gift that carries the spirit of the giver and creates connections between humans that transcend transaction. When we dilute this, won't the writing and reading process become less about capturing our uniquely human experience? I think so. I also think our human experience is inextricably intertwined with technology, so I do see value in engaging technology in this process, but the line is so, so fine.
Your point about writing quality being an excellent benchmark for AI capability makes some sense to me. The models excel at information processing but struggle with what you call "intellectual play": that fragment-using, punctuation-experimenting, voice-finding work that emerges from genuine care about something.
I've drawn a hard line at using AI for fiction writing, precisely because of what you describe as the dance of crystallizing ideas. That crystallization process, the struggle to find the right words for what we care about, is, I think, ultimately more important than the final product. But then that raises more questions about the reading process. As a tangent, I've found that exploring dyslexia offers an eye-opening glimpse into thinking about this differently than I would from my own (voracious) reader's perspective.
I suppose, for me, the bottom line is that when we outsource that struggle (to find the right words for what we care about), we might lose more than efficiency; we might lose a fundamental way of knowing ourselves.
Then the question is less whether AI can write well enough to fool us, and more whether the process of writing, that 45-minute dance you describe, is itself irreplaceable.