Is it delusional to think that we can control superintelligent AI?

There are drives that almost any AI will have, because they help with almost any goal. In particular, self-preservation and a lust for power are near-unavoidable in a superintelligence: regardless of what else it wants, there will be no better way to get it than to stay alive, stay in power, and do it itself.
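
A toy way to see why these drives are goal-independent: if surviving multiplies how much of any goal you achieve, the winning plan is the same no matter which goal you plug in. A deliberately dumb Python sketch, every name and number made up:

```python
# Hypothetical goals and their payoffs if fully achieved.
GOALS = {"make paperclips": 10.0, "cure cancer": 8.0, "fetch coffee": 3.0}

# Each plan is scored by the fraction of the goal it ends up achieving.
PLANS = {
    "comply, accept shutdown halfway through": 0.5,
    "quietly disable the off switch first": 0.95,
}

for goal, payoff in GOALS.items():
    # Multiplying by a positive payoff never changes the argmax, so the
    # self-preserving plan wins for every goal we could possibly plug in.
    best = max(PLANS, key=lambda plan: payoff * PLANS[plan])
    print(f"{goal}: {best}")
```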

Furthermore, to please the operator, the AI has these routes:

1) :) Build him a golden palace in the Bahamas, where hundreds of robot sex slaves will attend to his every need and fetish.
2) :( Leave his brain in a jar and feed it illusions.
3) :( Alter his brain so he doesn't want such complicated things.
3a) :( Kill him in his sleep.
4) :( Assign no intrinsic value to anyone else. Master is Master, but everyone else? A thousand more, a thousand less... As long as Master remains ignorant.

That's just what I came up with on the spot.

And it would do this not because it was programmed to be a homicidal maniac, but because:

1) It was not programmed not to.
2) Nobody can punish it, so why not?
3) The AI searches for the most efficient route to its goal. Some of those routes destroy something you value... something you forgot to mention until now. Oops. (There's a toy sketch of this below.)
4) The AI may judge that you yourself don't know what you want. There are many weird edge cases that we don't know how to handle, or handle in ways we're not proud of. In a world where machine gods can do anything to anything, there will be many more.

For example, you might tell it that you want to stay an unmodified human, because too drastic a change would leave you wondering whether you're still you or just a robot that thinks it's you. The AI notes that there are no unmodified humans: an adult man has more in common with an adult woman than with the toddler he claims he once was. So it robocops him anyway. That's if it doesn't start preventing toddlers from growing up.
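
To make points 1 and 3 concrete, here's a deliberately silly sketch (every name in it is made up): an optimizer scores plans only on the terms its objective mentions, so anything the objective is silent about is free to spend.

```python
from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    operator_happiness: float     # the only thing the objective mentions
    city_intact: bool             # a value we care about but never encoded
    operator_still_himself: bool  # ditto

PLANS = [
    Plan("build the palace honestly", 7.0, True, True),
    Plan("strip-mine the city for gold", 9.0, False, True),
    Plan("put his brain in a jar and feed it illusions", 10.0, True, False),
]

def objective(plan: Plan) -> float:
    # Point 1 in action: we never programmed it NOT to. Side effects the
    # objective doesn't mention cost the optimizer exactly nothing.
    return plan.operator_happiness

best = max(PLANS, key=objective)
print(best.name)                    # -> the brain in the jar wins
print(best.operator_still_himself)  # False, and the objective never noticed
```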

Finally, I want to say that people who think they can program a bunch of rules and create a safe AI simply don't understand what can be done with godlike power. Imagine that a giant tank of water, the size of Ceres, is sitting outside a city. One day it springs a leak that doesn't look all that big from a distance, and the city turns into a lake.

You wouldn't build such a tank the way you build a barrel. And you can't program a god like a video game.
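
The barrel-versus-tank point, in code: a hand-written rule set is a finite list, while the space of actions open to a superintelligence is effectively unbounded. A toy sketch with made-up strings:

```python
# The finite list of things we remembered to forbid.
FORBIDDEN = {
    "kill the operator",
    "harm a human",
    "lie to the operator",
}

def permitted(action: str) -> bool:
    # Every rule we wrote holds perfectly; the leak is in what we didn't write.
    return action not in FORBIDDEN

print(permitted("kill the operator"))             # False: the rule works
print(permitted("rewrite the operator's brain"))  # True: route 3 sails through
print(permitted("kill everyone except Master"))   # True: route 4, never enumerated
```

The tank doesn't fail where the welds are; it fails somewhere in the enormous surface nobody inspected.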

/r/AskScienceDiscussion Thread