In his 2007 book I Am a Strange Loop, Douglas Hofstadter, best known for writing the Pulitzer Prize-winning Gödel, Escher, Bach: An Eternal Golden Braid, discusses his reaction to the untimely death of his wife, Carol, who died of a brain tumor in 1993 while the family was on sabbatical in Italy. If you’re interested in simulation and how it might work within the human brain, I highly recommend the book, and especially the chapter devoted to this subject. He writes:
I realized… that although Carol had died, [a] core piece of her had not died at all, but that it lived on very determinedly in my brain…
The name “Carol” denotes, for me, far more than just a body, which is now gone, but rather a very vast pattern, a style, a set of things including memories, hopes, dreams, beliefs, loves, reactions to music, sense of humor, self-doubt, generosity, compassion, and so on. Those things are to some extent sharable, objective, and multiply instantiatable, a bit like software on a diskette.
This is interesting enough, but Hofstadter goes further and asserts not only that his wife’s “very vast pattern” exists, at least in piecemeal form, in a variety of locations, including within his brain, but that this pattern is capable of executing itself (or being executed):
Along with Carol’s desires, hopes, and so on, her own personal sense of “I” is represented in my brain, because I was so close to her, because I empathized so deeply with her, co-felt so many things with her, was so able to see things from inside her point of view when we spoke, whether it was her physical sufferings… or her greatest joys… or her fondest hopes or her reactions to movies or whatever.
For brief periods of time in conversations, or even in nonverbal moments of intense feeling, I was Carol, just as, at times, she was Doug. So her “personal gemma” (to borrow Stanislaw Lem’s term in his story “Non Serviam”) had brought into existence a somewhat blurry, coarse-grained copy of itself inside my brain, had created a secondary Gödelian swirl inside my brain (the primary one of course being my own self-swirl), a Gödelian swirl that allowed me to be her, or, said otherwise, a Gödelian swirl that allowed her self, her personal gemma, to ride (in simplified form) on my hardware.
In other words, without conscious effort, Hofstadter was running a “coarse-grained” simulation of his wife inside his own brain. I think this is a profound insight. If you’re married or similarly partnered, and especially if you’ve been partnered a long time, I’m willing to bet that you do the same thing all the time: unbidden, you imagine how your partner might react to a given situation. That’s because, over the years, your subconscious has constructed a simulation of your partner’s personality, one that has gotten better and better (that is, it resembles reality more and more closely). And you don’t have conscious control over the simulations your subconscious is constantly running. None of us does.
I find it interesting that in the book, Hofstadter uses the word “simulation” only twice, and in a curious tone:
Without going into more detail, let me simply say that it makes perfect sense to discuss living animals and self-guiding robots in the same part of this book, for today’s technological achievements are bringing us ever closer to understanding what goes on in living systems that survive in complex environments. Such successes give the lie to the tired dogma endlessly repeated by John Searle that computers are forever doomed to mere “simulation” of the processes of life. If an automaton can drive itself a distance of two hundred miles across a tremendously forbidding desert terrain, how can this feat be called merely a “simulation”? It is certainly as genuine an act of survival in a hostile environment as that of a mosquito flying about a room and avoiding being swatted.
In the same paragraph, Hofstadter is both (a) arguing that artificial brains carry no fundamental traits that would prevent them from being considered life forms and (b) seemingly using the word “simulation” as a pejorative. Yet what is the “coarse-grained copy of itself” created by the “personal gemma” of his late wife, the copy that “rides” on the “hardware” of his brain, but a simulation?
If we begin thinking of brains as multi-layered, overlapping, nested simulations, both hierarchical and non-hierarchical, that is, if we think of brains as simulation engines, then the idea of carrying around a coarse-grained copy of the person we know and love the most, a copy that, unbidden, is invoked (or invokes itself), becomes much easier to understand and accept. And in the context of discussing cognition, the word “simulation” no longer carries a negative connotation, nor should it.