[Request] How much RAM would it take to run the world?

I think what you meant to ask is how powerful a computer we'd need overall, because in truth, RAM won't have much impact. This is all very theoretical, and with this many variables there's no definitive answer, but I'll explain what technologies would likely be involved if we had to attempt this today.

First off, this can't possibly be done on a single computer, which means we'd have to network several servers together via Ethernet. Ethernet's bandwidth is tiny compared to the bandwidth of even older internal hardware (memory buses, PCIe), making real-time simulation almost impossible.
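To put rough numbers on that bottleneck, here's a quick comparison. The figures below are typical spec-sheet values I'm assuming for illustration, not measurements:

```python
# Back-of-envelope link bandwidths in GB/s (assumed spec-sheet figures).
ETHERNET_10G = 10 / 8      # 10 Gb/s Ethernet is about 1.25 GB/s
DDR3_MEMORY = 12.8         # single-channel DDR3-1600, ~12.8 GB/s
PCIE_3_X16 = 15.75         # PCIe 3.0 x16, ~15.75 GB/s

print(f"Memory is ~{DDR3_MEMORY / ETHERNET_10G:.0f}x faster than 10G Ethernet")
print(f"PCIe 3.0 x16 is ~{PCIE_3_X16 / ETHERNET_10G:.0f}x faster")
```

So even with fast 10-gigabit networking, every byte crossing between servers moves an order of magnitude slower than it would inside a single machine.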

To determine the required processing power, we can look at current applications.

Disney's Big Hero 6 drew attention in the tech industry for its Hyperion global-illumination renderer. As I understand it, this is currently the most advanced and accurate lighting simulator in existence. ([Their system used a total of 55,000 Intel CPU cores spread across four coordinated data centers just to compute the lighting.](http://www.engadget.com/2014/10/18/disney-big-hero-6/)) I'm not sure exactly which processors they used for the cluster, but I imagine they're high-end. Keep in mind that the models in Big Hero 6 are vastly simpler than the objects that exist in reality, and that Hyperion only renders the light for a small area at a time (typically a building and some of its surrounding exterior), which means we can expect the core count for reality to be much larger. I can't find numbers on exactly how long each frame takes to render, but we can assume it's a while. Also worth noting: this only covers the lighting. It doesn't include physics at all.
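Just to show how badly this scales, here's a purely illustrative estimate. Every number except the 55,000-core figure from the article is a guess I'm making up (the patch size especially), and it charitably assumes a real-world patch is no more complex to light than a movie scene:

```python
# Purely illustrative scaling estimate; all assumptions are labeled.
hyperion_cores = 55_000      # cores reportedly used by Disney (from the article)
scene_area_m2 = 100 * 100    # assume one render job covers a ~building-sized patch
earth_surface_m2 = 5.1e14    # Earth's surface area, ~510 million km^2

patches = earth_surface_m2 / scene_area_m2
cores_needed = patches * hyperion_cores
print(f"~{cores_needed:.1e} cores")   # on the order of 10^15 cores
```

Even with these generous assumptions, you land around a quadrillion cores just for lighting.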

So now we get to the physics. I'll admit that I know very, very little about scientific physics engines, but I know they tend to use PPUs (physics processing units: chips with thousands of small, weak cores that run a lot of simple equations very quickly by spreading out the work) to accelerate simulation times, similar to a GPU. It's very rare for these engines to actually work at the atomic level; they usually work with far coarser tessellations in their collision checks. I can't find details on the hardware requirements for these, but chances are they're much smaller than Hyperion's. If we removed all life from Earth except trees, a PhysX simulation of the entire planet might just be possible if you found a way around the Ethernet bottleneck and threw enough processing power behind it.
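The "many weak cores running simple equations" idea can be sketched in a few lines. Here NumPy's vectorized operations stand in for the thousands of PPU/GPU cores, and the particle count and constants are made up for illustration:

```python
import numpy as np

# Toy data-parallel physics step: the same tiny equation is applied to
# every particle at once, which is exactly the workload PPU/GPU cores
# parallelize across their many small cores.
n = 1_000_000                            # number of particles (arbitrary)
dt = 1.0 / 60.0                          # 60 simulation steps per second
pos = np.zeros((n, 3))                   # positions start at the origin
vel = np.random.randn(n, 3)              # random initial velocities
gravity = np.array([0.0, -9.81, 0.0])

def step(pos, vel):
    vel = vel + gravity * dt             # one simple equation per particle
    pos = pos + vel * dt
    return pos, vel

pos, vel = step(pos, vel)
```

The point is that each core's work is trivial; the hardware wins by doing a million trivial things at once rather than one hard thing fast.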

The last thing I can think of is sound. Very expansive soundscapes can be "rendered" from a DAW in real time, even with intensive effects, on specialized high-end workstations. However, these sessions usually involve fewer than a hundred 24-bit tracks, the audio isn't "rendered" in 3D, and it's absolutely not working at the atomic level. In truth, you won't find anything right now that simulates actual sound waves propagating through a 3D environment. Since light and sound behave similarly in some ways, I'd assume the requirements of such a system would be similar to Hyperion's.
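For a sense of scale on the DAW side, the raw data rate of a hundred-track session is easy to work out. The sample rate and track count here are assumed typical values, not figures from any specific system:

```python
# Raw uncompressed throughput of a DAW session (assumed typical figures).
tracks = 100
sample_rate = 48_000        # samples per second, per track
bit_depth = 24              # bits per sample

bytes_per_sec = tracks * sample_rate * bit_depth // 8
print(f"{bytes_per_sec / 1e6:.1f} MB/s")   # 14.4 MB/s for the whole session
```

About 14 MB/s is nothing for a modern workstation, which is exactly why real-time mixing is feasible: the DAW is shuffling flat sample streams, not simulating wave propagation through air.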

TL;DR With modern technology, you'd need an exorbitant number of processors just to render everything at far slower than real time, and RAM capacity would have very little impact. RAM doesn't improve processing power; it improves the speed and amount of data that can be retrieved. Since you aren't rendering in real time anyway, RAM is one of the least important factors in this kind of use case.

/r/theydidthemath Thread