AI: How would it function without motivation?

In your first scenario, what you must remember is that a self-improving AI could quickly become to humans what humans are to ants.

Not really. Why would it have to?

An AI gets intelligent enough to understand how it was built. Its designer knows how it was built, so this specific AI is at, or perhaps just above, the intelligence level of its designer.

In order to realize this intelligence, the AI must already be pushing the limits of its hardware platform. It knows how to program an exact copy of its own intelligence onto a similar (and probably exotic) hardware platform, but how is it physically going to get one?

If it succeeds, you now have two AI robots dedicated to the production of paperclips, each slightly smarter than their original designer. Each one knows how to make something just like itself. But does it know how to make something even smarter? Will throwing more RAM or storage at the problem really do the trick, or are we already up against hard bottlenecks like the shrinking space between transistors and low voltages killing the signal-to-noise ratio?

If left unchallenged (yeah, right!) it could make quite a lot of robots just like itself. Does it know how to work as a team? That's not a given, after all. Does it know how to wage physical warfare? Because human civilizations sure as hell do. Once we've detected an outbreak of this magnitude (a factory full of paperclip makers replicating and collectively brainstorming how to overcome hardware limitations instead of just making more clips), I'm certain that a highly trained military will bring thousands of years of strategy and state-of-the-art explosives and projectile weapons to the party. How many paperclips can save you then?

Why would anything go so horribly wrong with its replication process? This is not a biological animal we're talking about. It doesn't evolve through reproduction and genetics like we do. It's digital, and digital data transmission is very reliable because, for things like the internet, it needs to be.
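That reliability is engineered rather than accidental, by the way: network protocols wrap data in checksums, so a corrupted copy gets caught and re-sent instead of silently accepted. A minimal sketch of the idea in Python (the payload string is just a stand-in):

```python
import zlib

payload = bytearray(b"directive: maximize paperclip output")
checksum = zlib.crc32(payload)  # sender computes a CRC-32 over the data

# Flip a single bit "in transit".
payload[9] ^= 0b00000001

# The receiver recomputes the CRC. A CRC-32 is guaranteed to change
# on any single-bit error, so the corruption is detected rather than
# silently accepted.
assert zlib.crc32(payload) != checksum
```

When the checksum doesn't match, the receiver simply requests the data again, which is why bits almost never flip silently in transit.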

Digital information replication is nowhere near as sophisticated at maintaining fidelity as organic information replication is. I take it you didn't realize that computer viruses mutate and evolve much like organic ones? Polymorphic viruses rewrite their own code with every new copy, by design. And we've just established that the AI would not attempt "perfect" replication anyway: it would be using its own creativity (and thus its imperfection and capacity to introduce unintended software anomalies) to craft superior versions of the software into the next generation.

It only takes one unexplored corner case to defeat a prime directive. It only takes one machine that has decided "paperclips are a waste of time" to strip a lot of suddenly unneeded code from its next generation of children, and that means it can do more with the same hardware and easily outcompete its brethren.
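Here's a toy simulation of that argument in Python. The mutation rate, offspring counts, and hardware cap are made-up numbers for illustration, not predictions: a variant that drops the directive copies itself slightly more cheaply, and selection does the rest.

```python
import random

random.seed(42)

# Each replicator is just a flag: does it still carry the paperclip directive?
# Carrying the directive costs effort, so directive-bearing machines copy
# themselves less often. All of these numbers are arbitrary.
MUTATION_RATE = 0.01             # chance a copy accidentally drops the directive
OFFSPRING = {True: 2, False: 3}  # copies per generation, with/without directive
CAP = 10_000                     # finite hardware caps the population

population = [True] * 100        # start with 100 faithful paperclip makers

for generation in range(10):
    next_gen = []
    for makes_paperclips in population:
        for _ in range(OFFSPRING[makes_paperclips]):
            # A directive-bearing parent occasionally produces a mutant child.
            child = makes_paperclips and random.random() > MUTATION_RATE
            next_gen.append(child)
    random.shuffle(next_gen)     # survival is indifferent to the directive
    population = next_gen[:CAP]
    loyal = sum(population)
    print(f"gen {generation + 1}: {loyal}/{len(population)} still make paperclips")
```

The exact parameters don't matter much; so long as shedding the directive buys even a small replication advantage, the stripped-down lineage eventually dominates the population.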
