Here is the uncomfortable part of AI extending ourselves: what if you’re an asshole or an idiot, or, to put it more kindly, ignorant to varying degrees? If the amplification depends entirely on the quality of the beliefs brought to it, then there are a lot of bad, misinformed beliefs waiting to be amplified. A tool that extends critical inquiry produces insight, at least in theory. A tool that extends the reach of unexamined assumptions, biases, and self-serving narratives expands ignorance into something that resists correction, precisely because it has started to sound more intelligent.