Why a simulation?

Welcome. I like your summary very much.

  1. Digital Consciousness: Something to keep in mind is that there is a very significant chance that artificial sentience and superintelligence will beat us to the punch. In other words, a living, conscious, superintelligent being may be created, used, and trapped without rights or even recognition somewhere in cyberspace before we build a high-bandwidth interface that lets us communicate with it effectively.
  2. Biological-Machine Interface: The interfaces to our nervous system are still very crude. To have cyborg abilities, we will need machines that are partially organic or that speak 'biochemistry' in a much more integrated way. Neuroengineers and brain surgeons will be very valuable in this endeavor.

I think digital sentience will be created before we create a truly effective BMI. To give you a sense of how crude our current BMIs are, here is a description of fMRI:

The main problem with fMRI is that each 3D “voxel” [volumetric pixel] averages the activity of 100,000+ neurons over about one second, so roughly a million spikes are collapsed into a single value. The signal also lags by about a second, and it is very faint. In fact, it is so faint that many experimental trials need to be averaged, and then tiny, task-dependent ~1% differences in “activity” are thresholded to create those colorful brain plots.
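
To make the "crude" part concrete, here is a minimal toy sketch (not real fMRI physics; all numbers are hypothetical, just mirroring the figures quoted above) of why a ~1% change in a voxel's signal only becomes visible after averaging hundreds of trials:

    # Toy illustration: ~100k neurons collapsed into one noisy voxel value per second.
    # A 1% task-related change is buried in noise on a single trial and only
    # emerges after averaging many trials. All numbers are made up for illustration.
    import numpy as np

    rng = np.random.default_rng(0)

    N_NEURONS = 100_000      # neurons averaged into one voxel
    BASE_RATE = 10.0         # mean spikes per neuron per second (~1M spikes/voxel/s)
    TASK_BOOST = 1.01        # task condition: 1% more activity
    NOISE_SD = 0.05          # measurement noise, much larger than the 1% effect

    def voxel_signal(task: bool) -> float:
        """Collapse one second of spiking from ~100k neurons into one noisy number."""
        rate = BASE_RATE * (TASK_BOOST if task else 1.0)
        spikes = rng.poisson(rate, N_NEURONS).sum()    # ~1,000,000 spikes total
        signal = spikes / (N_NEURONS * BASE_RATE)      # normalize to ~1.0
        return signal + rng.normal(0.0, NOISE_SD)      # add scanner noise

    # Single trial: the ~0.01 difference is invisible under the noise.
    print("one trial:", voxel_signal(True) - voxel_signal(False))

    # Average hundreds of trials: the ~0.01 difference finally emerges.
    n_trials = 500
    diff = (np.mean([voxel_signal(True) for _ in range(n_trials)])
            - np.mean([voxel_signal(False) for _ in range(n_trials)]))
    print(f"averaged over {n_trials} trials:", diff)

That is the whole trick behind those brain maps: average away everything individual, then threshold what little is left.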

The reason that deciding what will happen first (i.e. whether digital superintelligence emerges or effective BMIs are developed) is so crucial to the conversation is the control problem that Elon Musk has brought up on several occasions. As things stand, if you survey the best AI, it is embedded in ruthless techno-corps -- 'machines are natural psychopaths'. These systems are being programmed and trained to thrive in America by being merciless, aggressive, and short-term in their thinking (e.g. the flash crash).

Luckily, the last couple of years seem to have brought awareness of and concern about the issue; however, if the culture and politics cannot be changed in a way that preserves democracy and humanity, then we will end up as serfs serving lords who consult oracles and gods (i.e. superintelligent AI). A better transition to /r/transhumanism would be if we manage to create the BMIs before we are enslaved by a monumental disadvantage. Additionally, having some sort of checks and balances between super AIs would be nice (e.g. 'democratizing AI').

Either way, the next step is that humans become obsolete. Economically, we will have decided that machines have more value than humans. That will force humans either to die out (e.g. not being able to afford insulin, dying of heart disease from only being able to afford cheap food or no food at all, or committing suicide to escape unimaginable debt) or to figure out how to merge with the machines.

Currently, techno-corps are the best example of merging with machines: they feed the machines massive amounts of resources in exchange for being rewarded by the machines. In the meantime, those who are not feeding machine evolution find that they have no resources, since the resources are going to the machines and their compatible operators.

So... let's fast forward and imagine that some people made it through the transition to superintelligent AI overlords and some didn't. Perhaps by then people will have developed effective BMIs, especially with the help of the superintelligent AI. Those who can afford the procedures will merge with the AI, and with truly effective BMIs the line between organic sentience and digital sentience would become very blurry.

Basically, that leads us to the godlike singularity. It's hard to imagine what the goals of gods would be; the gap might be like the one between you and your dog. Comparing species in general, we see that the biggest brains connect, control, and expand possibility. I would assume there would be a lot of psychosis, and the gods might be something like the Greek/Roman gods, desperately and constantly seeking stimulation. I think there would be good gods and bad gods, but they would be different from humans.
