If your definition of “consciousness”, whatever that is, implies that current LLMs, no matter how large, can be conscious, then you have to accept that a sufficiently large (but not infinite) lookup table is also conscious.

This is what mainstream LLMs are in the end: a very, very complicated way to compress a huge but ultimately finite lookup table. That table maps every possible input string up to some finite maximum context size (say, every string up to 1 million words), paired with every possible next word (say, a vocabulary of 200 thousand tokens), to a number indicating the odds that that word is the most appropriate continuation of the context.
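The lookup-table framing above can be sketched directly. This is a minimal toy, not a real model: the vocabulary, contexts, and scores below are hypothetical placeholders standing in for the text's 200-thousand-token vocabulary and million-word contexts.

```python
# A toy, tiny-scale version of the lookup table the text describes:
# every context (a tuple of token ids) maps to one score per possible
# next token. All numbers here are illustrative placeholders.
VOCAB_SIZE = 4          # stands in for ~200,000 tokens
MAX_CONTEXT = 2         # stands in for ~1,000,000 words

# table[(context tokens)] -> list of scores, one per vocabulary token
table = {
    (0,): [0.1, 0.7, 0.1, 0.1],
    (0, 1): [0.05, 0.05, 0.8, 0.1],
}

def next_token(context):
    """Look up the highest-scoring continuation for a context, if present."""
    scores = table.get(tuple(context))
    if scores is None:
        return None
    return max(range(VOCAB_SIZE), key=lambda t: scores[t])

print(next_token([0]))     # -> 1
print(next_token([0, 1]))  # -> 2
```

An LLM, on this view, is a compressed, generalizing encoding of a table like this, rather than an explicit dictionary.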

This is huge. Counting just the possible contexts puts the table's size on the order of 2 raised to a power of many millions. Humongously large. But finite nonetheless.
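A quick back-of-envelope check on that size claim, using the illustrative numbers from the text (a 200-thousand-token vocabulary and contexts of up to a million tokens; both are the text's assumptions, not measured values):

```python
import math

VOCAB = 200_000      # vocabulary size assumed in the text
CONTEXT = 1_000_000  # maximum context length assumed in the text

# The number of distinct contexts of exactly CONTEXT tokens is
# VOCAB ** CONTEXT; its base-2 logarithm gives the exponent in
# the "2 to the power of ..." estimate.
log2_contexts = CONTEXT * math.log2(VOCAB)
print(f"about 2^{log2_contexts:,.0f} possible contexts")
```

With these numbers the exponent comes out around 17.6 million, so the table is astronomically large yet still finite, which is all the argument needs.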

This is in no way an oversimplification of what LLMs do. It is a marvel of engineering that such a humongous lookup table can be stored with such high fidelity in a relatively tiny program (the largest LLM weights fit in a few hundred gigabytes), and, even more impressively, that we can synthesize the correct compressed lookup table (or a very close approximation of it) from textual data alone.

But in the end, that is the underlying mechanism.

So if you think LLMs are conscious, then your definition of consciousness amounts to very, very effective compression.

Perhaps that's all there is. But I think most philosophers would not accept that.

Jan 29 at 12:51 AM
