Introduction to "S-risks" - scenarios where civilization does not end with extinction, but endures indefinite severe suffering

> There are a lot of stars in our galaxy.

Yes, but then it's not as astronomical as the writer says it is. My point is that the article is poorly written.

> Likewise there is room for improvement in AI, quantum computing, photon computing, and other things.

Double sigh. Photon computing is not much faster than electron computing, for starters. And quantum computers are not magic; they'll have to be developed a LOT before they can do what our silicon computers do now, barring some specialized tasks. Some researchers even doubt they'll ever completely replace silicon.

> Note that humanity has already run into all sorts of limits, such as Mach 1 in air travel, limiting density of populations, sustainability limits for renewable resource extraction, and varying levels of depletion of unsustainable resources.

None of those were hard theoretical limits, so that's a false equivalence. Also:

> E.g., someone in 1900 predicting that locomotives would never go faster than 100mph wouldn't realize that planes would take people on trips at 300mph instead. Likewise there is room for improvement in AI, quantum computing, photon computing, and other things.

In 1900 the hard theoretical prediction would have been that locomotives could go arbitrarily fast, since Newtonian physics imposes no speed limit. So if anything, theoretical limits have become more restrictive since then, not less.
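To make that contrast concrete, here's a quick sketch (my own illustration, not from the article): under Newtonian mechanics, kinetic energy grows only quadratically with speed, so nothing forbids going past c; under relativity, the energy needed diverges as v approaches c.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def newtonian_ke(m, v):
    """Newtonian kinetic energy: grows without bound, no implied speed limit."""
    return 0.5 * m * v**2

def relativistic_ke(m, v):
    """Relativistic kinetic energy: diverges as v -> c."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * m * C**2

m = 1.0  # 1 kg test mass
for frac in (0.1, 0.9, 0.99, 0.999):
    v = frac * C
    print(f"v = {frac:.3f}c  Newtonian: {newtonian_ke(m, v):.3e} J  "
          f"relativistic: {relativistic_ke(m, v):.3e} J")
```

At 0.1c the two numbers are close; by 0.999c the relativistic cost is dozens of times the Newtonian estimate and keeps blowing up, which is exactly the kind of hard limit 1900-era theory didn't have.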

> You can have disconnected civilizations

True, but then any of the measures in the dumb article are meaningless: civilizations' morals can diverge in as little as a few hundred years. Also, the article reads as if the writer is imagining a single connected civilization.

/r/Futurology Thread Parent Link - s-risks.org