Consciousness

The limits define the system. In a traditional digital computer there are standards--standard architectures, standard protocols for input and output, and so on--so programs and peripherals can be transferred and are interchangeable. Biology isn't really like that. Especially in more complex creatures, there's a lot of variety between individuals. We feel different things, we have different bodies, and we have different neural circuits. I think consciousness is very much defined by what you put it in.

I don't think consciousness is a stand-alone object that can be transplanted. It's an emergent property of a giant feedback system which includes your body in the loop. In short, I don't think it's possible for a transplanted consciousness to remain the same consciousness it was before transplantation. If you were to run consciousness A and consciousness B, where B is the transplanted one, side by side, I think you'd see the two diverge pretty quickly.

However, I think this divergence might only be noticeable to an outsider. Identity is a set of characteristics which appear to be preserved through time. Although all of your characteristics are always in flux, the brain reinforces neural circuits to represent the characteristics that don't change much. This is just a representation, so what you have is a model of yourself. It can be wrong--we constantly do things that we think we'll like and want to do, only to find out later that our idea of ourselves was off. Anyway, the fact that you are the same you from one moment to the next is just an approximation. Because it's approximate, maybe you'd round away the error of having a totally new body and you'd still feel like you.

The thing is, even if you still think you're the same you, the error of that approximation might be much greater than usual, and you might feel it. If the error is too large--say you were to try to exist inside something very foreign--I don't think consciousness would be stable. If hunger is now the feeling of initiating a DHCP handshake and moving your left arm changes your clock speed, I think this would be incompatible with consciousness and you would no longer feel like you. Depending on the hardware you're put in (and I think AI is a hardware problem), you might be able to grow the neural circuits to generate a new consciousness, and that new consciousness might coexist with some of the relevant remnants of the old one. But if the hardware can't evolve circuitry that holds a consistently relevant representation of your new reality and engages your old circuitry, I think you'd cease to be.

I work on some hardware which I think could be developed into the kind of thing you might upload a consciousness into, so I've given this a lot of thought. I really think that human vulnerabilities heavily define our human nature and our human consciousness. I don't think my personality would be stable in a machine that doesn't need sleep, food, or sex, or that doesn't feel pain. I think if I were uploaded into such a thing, I could do terrible things. We like to think that empathy and compassion are just innate, but I think the pains we feel keep us in check and prevent us from becoming monsters. In the end, I think I'm just a system that requires certain feedbacks (my body and its needs and senses) to remain stable. Removing or altering that feedback? I'm much more afraid of that than I am of death.

Also, I don't really think it's possible to upload a consciousness at all. If we had hardware capable of human-like intelligence, you'd have to raise it like offspring, and maybe some of you would rub off on it. There are a couple of reasons for this. One is that our neural networks are highly non-linear, often exhibit chaos, and of course each neuron is its own freakishly complex living organism, so the whole thing is a highly sensitive system. You'd need to copy it with perfect accuracy, and I just don't expect such a copy to be physically possible now or ever. Then you'd need to move that copy into hardware similar enough to a brain that this highly sensitive system doesn't notice the difference. The other reason is that I don't think human-like intelligence is possible in hardware which is programmable--it has to be emergent. This is something I might publish, so I shouldn't go into much detail there.
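
To give a toy sense of what "highly sensitive" means here (a minimal sketch using the logistic map, a standard chaotic toy system, not a model of the brain): two states that differ by a copying error of one part in a trillion end up completely unrelated after a few hundred iterations.

```python
# Toy illustration of chaotic sensitivity (not a brain model).
# The logistic map in its chaotic regime amplifies a tiny difference
# between an "original" and an "almost perfect copy" until the two
# trajectories have nothing to do with each other.

def logistic(x, r=3.9):
    """One step of the logistic map with r in the chaotic regime."""
    return r * x * (1.0 - x)

a = 0.5            # "original" state
b = 0.5 + 1e-12    # "copy" with a one-part-in-a-trillion error

for step in range(1, 201):
    a, b = logistic(a), logistic(b)
    if step % 50 == 0:
        print(f"step {step:3d}: original={a:.6f} copy={b:.6f} diff={abs(a - b):.2e}")
```

By around step 100 the difference is already of order one, i.e. as large as the states themselves. A brain is vastly more complex than this one-variable map, which is the point: any imperfection in a copy would get amplified rather than washed out.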

/r/INTP Thread