Computing in the '90s vs. computing in 2018

While that's technically true, it's also technically true that in many situations a lot of data isn't necessary and can safely be discarded. Just look at CERN, for example: they have to throw away most of the data they generate, because it's physically impossible to store it all (and most of it is uninteresting anyway).

A more everyday example is the gigabytes of pre-rendered lighting, shading, and reflection textures that are the reason most games are so massive and take so long to load, even from an SSD. If we didn't care so much about shiny things, we could go back to making video games instead of loading simulators and interactive movies.
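Just to put rough numbers on it (a totally back-of-envelope sketch; the figures below are made up for illustration, not pulled from any real game): a single uncompressed 4K lightmap is already about 64 MB, and baked lighting means shipping a lot of them.

```javascript
// Back-of-envelope math with illustrative numbers (uncompressed, hypothetical game):
const width = 4096;
const height = 4096;
const bytesPerTexel = 4; // RGBA, 8 bits per channel

// One 4096x4096 lightmap, uncompressed
const bytesPerMap = width * height * bytesPerTexel;
console.log((bytesPerMap / (1024 * 1024)).toFixed(0) + " MB per lightmap"); // 64 MB

// Bake a few hundred of these across a game's levels and you're already
// deep into the gigabytes before a single model or sound file ships.
const mapsPerGame = 300; // made-up figure for the sake of the sketch
console.log(((bytesPerMap * mapsPerGame) / 1024 ** 3).toFixed(1) + " GB total"); // 18.8 GB
```

Real games compress these assets, but the point stands: baked "shiny things" scale up fast.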

Also, most websites perform like ass because they're loading dozens of mostly unused JavaScript libraries, sewn together into a patchwork quilt, to do something that should only take a few lines of code. Usually that's because the person in charge of the project doesn't actually know how to code and only got the job through nepotism.
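For what it's worth, here's a rough sketch of what "a few lines of code" looks like in practice (the endpoint and element ID are made up for the example). Modern browsers already ship fetch() and querySelector(), which covers a big chunk of what people pull in jQuery, axios, and friends to do.

```javascript
// Hypothetical example: grab some JSON and put it on the page,
// using only APIs built into every modern browser.
async function showLatestScore() {
  const response = await fetch("/api/score.json"); // made-up endpoint
  if (!response.ok) {
    throw new Error("Request failed: " + response.status);
  }
  const data = await response.json();

  // Plain DOM APIs instead of a whole manipulation library
  const target = document.querySelector("#score"); // assumes <span id="score">
  if (target) {
    target.textContent = data.player + ": " + data.points;
  }
}

showLatestScore().catch(console.error);
```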

Don't get me wrong, the idea that you can compress terabytes into megabytes is total sci-fi, but that doesn't mean we can't cut out the fat and make stuff more efficient. There's no real reason for most things to be as bloated as they are, other than incompetence, laziness, or some other human flaw.
