One of the interesting ideas cognitive literary theory exposed me to was the extent to which our cognition is embodied. Some of this is just trivially obvious, in that things like our sense of direction or our ability to solve physical puzzles depend on our perceptions of geometry, angles, and proportions. But it runs deeper than that, with things like the method of loci using our spatial reasoning to enhance our memory, two faculties which are not obviously related to one another.

Even that, however, is scratching the surface. When you try to delve into the mechanics of how language slices up the world, both in the concrete and the abstract, you find that ultimately all human language is physical or sensory, and only from there extended by analogy to the abstract and philosophical. There are words whose obvious meanings are both physical and abstract (network, bridge, impression, depth), but few or none that run in the other direction. (Words relating to time don’t count, since time is something we perceive with our senses, even if it’s not directly physical–still part of embodied cognition. Likewise emotions, which we physically locate in our bodies even if they exist only in the mind.) The weak version of this claim would admit that there are words for abstract concepts whose etymology is non-abstract (institution springs to mind), but arguments from the history of words don’t actually tell us much about how words work now (the language of an individual speaker is synchronic, not diachronic; language change has no memory). But I think the strong version holds as well: words like “institution” refer to groupings which naturally arise out of our sensory impressions, in the same way we look at a flock of birds and see it as one thing, albeit one with readily discerned components.
If you don’t mind rhetorically overblown statements, you could say that the human mind is dependent on far more than just the brain. Subjectively, at least, the mechanics of our cognition stretch out into the world around us.
All of that’s well and good when it comes to literary criticism, especially where it involves picking apart the minutiae of texts, but what really interests me is what it says about our ability to understand beings with cognition unlike our own. It’s not immediately obvious, for instance, why we should be able not just to translate but actually to understand the Epic of Gilgamesh, which is about as culturally remote from us as a text can get: written in a dead language, by people whose day-to-day experiences were completely different from ours, with virtually zero shared literary or religious references. But we do understand it, because human cognition hasn’t changed much in the millennia since it was written, and there are common landmarks we can point to which structure our understanding of the world in the same way they structured Sin-Leqi-Unninni’s.
What if those referents don’t exist? What of creatures that had, not just differently organized brains, but different senses, different linguistic universals, a different relationship to their physical world–or, in the case of an intelligence implemented as a computer program, none at all? Does it follow that we would have much in common with them–indeed, does it follow that we could understand them at all?
I don’t think it does. In fact, I think it requires assuming a lot that is actually up for grabs. Even if trends in terrestrial evolution, like cephalization, an organized nervous system, and cellular biology, hold in other contexts, even if they’re as natural a consequence of the physical laws of the universe as breathing oxygen to run your metabolism, you could still have fundamentally different creatures whose languages are simply unlearnable for humans, who have nothing of interest in the realms of art or philosophy to share with us, nor us with them, simply because our experiences of the universe are too different. I’m not saying we couldn’t communicate at all. We would deduce the same physical laws; we would probably be able to work out a you-fish-on-your-side-of-the-lake-and-I-fish-on-mine-and-nobody-fishes-in-the-middle-type agreement to live and let live, and maybe even some basic forms of trade. But actual communion could remain forever out of reach.
(And is this the solution to Fermi’s paradox? That alien intelligence is so different from our own we can’t recognize it from here even if it *is* leaving footprints all over the universe? I don’t think it’s likely, but I do think it’s possible.)
But I also think this is why we can’t assume the possibility of an artificial intelligence we could have any kind of meaningful communication with, never mind uploading a human mind. Even if you could implement a very good simulation of the human brain in a computer, a human mind unmoored from its body might not think the same way, might not have the same relationship with the world, might have a very different internal experience from anything an embodied human has; and the closer you got to bridging that gap, the more you would just be simulating a physical world inhabited in the same way we already inhabit our world. If that mind could copy itself, merge itself, alter itself, the differences would be even greater. And if that mind were built from the ground up, its intelligence developed on its own terms rather than those of DNA and cellular biology, why should it have anything in common with us at all? It would be as alien as any other kind of intelligence, and while perhaps we could customize it to do useful work for us, I’m still not sure its internal experience would ever be comprehensible or of interest to us, and vice versa.
I don’t think this means uploaded minds or entirely artificial intelligence are impossible, or even necessarily inadvisable. Just that there are aspects of discussions of the glorious post-scarcity transhuman future that remind me of people in 1950 predicting a society in 2020 with the same political and gender relations, or science fiction of the 1920s that predicts a universe populated with American-accented rubber forehead aliens whose societies are organized exactly like Earth’s. If the past has any lesson to offer, it is that the future is going to be a lot weirder than we imagine, or even than we can imagine.