Peter Watts has this short-short story about a brain interface technology that allows people to merge their consciousness with other organisms — and in this one, “Colony Creature”, someone experiences what it is like to be an octopus, and is horrified by it.
“Those arms.” His Adam’s apple bobbed in his throat. “Those fucking crawly arms. You know, that thing they call the brain— it’s nothing, really. Ring of neurons around the esophagus, basically just a router. Most of the nervous system’s in the arms, and those arms… every one of them is awake…”
It’s a good story, and I’m not knocking it. I think it’s also important to recognize that the experience of being a non-human organism is probably fundamentally different than being a human.
However, while it’s true that 2/3 of the neurons in an octopus are found in the arms, not the brain, I doubt that the experience of being an octopus is quite so distributed, or that it consists of eight independent conscious entities.
For example, your eyes are actually complex outpocketings of your brain — each one contains at least half a billion neurons interconnected with some very intricate circuitry. Likewise, your enteric nervous system — the neurons that drive the activity of your gut — consists of about half a billion cells, too. These things are bigger and more complex than the nervous system of my larval zebrafish, which can swim and learn and eat and carry out goal-oriented behavior.
If a Vulcan mind-melded with a human, do you think that they would report that the human experience is all about these creepy autonomous orbs on their face, darting about and collecting photons, and that we’ve all got this colossal slimy worm hanging from our heads, writhing and squirming and demanding to be fed? It would make a good creepy story, but it’s not an accurate picture of our conscious experience.
I’m not sure what consciousness is, but it’s almost certainly got to be a synthetic process that integrates complex information from multiple sources, guts and sensory organs and spinal cord and multiple modalities, and is also built up from different processing centers within the brain. I would expect the same is true of the octopus.
Although…a talented writer could probably put together a good horrifying story of a person with a fractured brain, where they became aware of all the separate components of the inputs to the human experience as if they were independent intelligent agents with limited and specific needs having conversations with a central ‘self’ (another problem with the Watts story: what is the octopus ‘self’ that the story-teller has become?). What do my eyeballs want? What is my amygdala muttering to itself? I know what my gonads want, and they better behave.
johnmarley says
His interpretation of being an octopus is almost identical to David Brin’s description of the Jophur/Traeki from the Uplift series.
Marcus Ranum says
Think how fun it’ll be for the “uploaders” if their fantasies come true – sensory systems would “feel” like… what? OMG my left grasping actuator is sending me a signal that feels like a drill going into the bone of my “thumb” please adjust it asap.
This whole issue is part of the fascinating discussion that happens when people make the mistake of thinking that they are not their bodies and/or that their brains aren’t their bodies. The brain is so interdependent with the rest of the body (if you don’t believe me, fuck up your adrenal gland!) that it’s not practically separable. Ditto the weird notion that humans are separable from their biome. Well, I mean, sort of. But long term? You’re no longer human, you’re something else.
consciousness razor says
Well, it’s not like we’re just aware of numerous separate phenomena, each of which gets measured independently and sent to a data storage facility somewhere. You’re integrating that stuff into a coherent perspective which places you in the center of the action. If you see a predator approaching, your experience isn’t (usually) a “fractured” array of colors, shapes, textures, motions, sounds, etc. You see a tiger (and some grass, trees, etc.), and you probably get the feeling that something bad is about to happen to you. You don’t have to stop and think that if your arm is gnawed off, it would be a problem for an arm that happens to be in your vicinity, then conclude that it would be a problem for you. You’ve had some time (or our species has had time) to develop a sense that it’s an extension of what you are, not just literally but within your experience as well. It gets moved out of danger, because you want to move yourself out of danger.
It seems very useful that we don’t have to do all of these separate, tedious bits of calculation consciously, every time we do or think anything. And there aren’t really a bunch of competing conscious entities that have to engage in a dialogue with each other or something like that. That’s because we have this unified experience of a self, in which the brain somehow throws it all into a blender and is able to present the resulting glop to itself on a nice shiny platter. The brain metabolizes this glop into useful behaviors or regulatory functions. I mean, if you want to use a different metaphor I guess you could say the parts are “talking” to each other during the blending, but it’s not like each of them knows what it’s talking about or is aware that it is something which has something to say. That only gradually builds up as the bits get put together and used.
Some people, however, do have a huge variety of different kinds of fractured experiences. There are lots of disorders out there demonstrating that, and people have written lots of (often horrifying) stories about them. Of course it doesn’t follow that those parts of our brains are agents themselves because they can act independently, or because the (ordinarily unified) agent which results from their operation is made of those parts working together. Whatever an exact definition of selfhood or agency should be like, people certainly don’t mean that my left ring finger or a region in my visual cortex counts as one of them. Do they? I mean, if anybody does mean that, it’s a pretty wacky idea.
Anyway, it makes sense that the unity of consciousness is adaptive for us, right? Why wouldn’t an octopus be in a similar situation? Are you suggesting there’s no rudimentary kind of self that an octopus experiences, PZ? Or that you don’t think there would be eight “tentacle agents,” for example, with even more agencies generated by other parts of its nervous system? The latter sounds pretty ridiculous to me. But the former is just something I’m not totally sure about.
Artor says
Of course, an octopus develops from its egg with 8 arms to integrate into its consciousness, whatever that may be. For a human with our measly four jointed, centrally controlled limbs, suddenly being gifted with 8 semi-autonomous ones and having neural feedback from all of them seems like it would be extremely disorienting. Those arms!
screechymonkey says
It’s funny how a lot of good and interesting sci-fi stories use futuristic technology but implicitly depend on an already outdated philosophical notion (mind/body duality). “An accident with the positronic antimatter renobulator causes the Captain’s mind to be stuck in Ensign Redshirt’s body” is sort of the equivalent of saying “the gene therapy we designed to fight your cancer has caused an imbalance of your bodily humours.”
edmond says
Reminds me of the John Varley trilogy of books, Titan, Wizard and Demon. Astronauts visit an alien creature in orbit of Saturn whose body is a giant organic, wheel-shaped space station, with other living creatures inside. Although there is a central brain which controls the organism, each segment of the wheel has its own brain to help control the giant being. As the stories progress, the entity goes slowly insane, and the separate brains become their own individuals, some insane themselves, and some which conspire with the human characters against the central brain. Very fun story, which of course has been dated quite a bit by technology more modern than what existed when the books were written. Having separate pieces of your brain able to consciously revolt against you would be disturbing, to say the least.
unclefrogy says
having observed infant human beings as well as other infant animals, it sure looks like what they are experiencing is a lot of semi-autonomous entities going off on their own, which they have to learn to integrate into some kind of “order”.
Some seem to be able to “gain control” of their bodies rather quickly, like prey animals; others take some time to integrate it all. Of all the animals, we seem to take the longest. That process is intriguing: what is going on, actually? Is consciousness being built up, or is it being added to?
uncle frogy
Kagehi says
“Of course, an octopus develops from its egg with 8 arms to integrate into its consciousness, whatever that may be.”
laurentweppe says
You don’t need a fractured brain.
Ever heard of umbilical hernias? Having one is exactly like having a one-sided “conversation” with a very pissed digestive tract that’s shouting “I hurt, and I fucking hate you, so now I’m on strike and I’m going to make you vomit all the half-digested shit that I refuse to process anymore, and Fuck You.”
smike says
Coincidentally, last night I was pondering consciousness and what life is while watching a science (astronomy) show. When the narrator stated that nothing can exist in such and such an environment, I had to assume that s/he was referring to life as we know it. This, to me, is problematic. We may never find ‘other’ life if we are simply looking for ourselves in far away places.
paulparnell says
Um, the thing is the octopus arms could all have separate consciousnesses and yet there may not be any shared consciousness.
Similarly all the individual brain structures in our brain could in principle perceive themselves as conscious entities accomplishing their function by conscious will. That does not necessarily mean that we as individuals will experience all the conscious entities that we are made of.
I could imagine a group mind made of thousands of eusocial insects, but that does not require that the hive mind is conscious of the individual insect consciousnesses.
Until we know what consciousness is we should be careful about what we claim is or is not conscious.
Matthew Trevor says
No discussion speculating on the consciousness of non-human entities is complete without the obligatory reference to Nagel’s “What Is It Like to Be a Bat?”
Menyambal says
Terry Pratchett covered some of this in his Discworld novels. Granny Weatherwax “borrows” the minds of other creatures, even feeling the consciousness of a hive of bees. In _Reaper Man_, Windle Poons doesn’t die, and has to learn to manage all of his body’s automatic functions.
Anri says
One of the Known Space races is the Jotoki – they (if I recall correctly) start off as 5 independently living, basically unintelligent eel-like creatures that join together at the heads to form something rather starfish-esque and link their neural networks to produce a committee-like arrangement. It’s not a single personality exactly, but not exactly independent personalities, either.
I seem to recall that most of the tech the Kzin have currently, they didn’t invent themselves, but stole from these guys.