Why is the term "colonialism" often limited to European empires since the Age of Discovery? Weren't ancient empires such as Rome, China, or Persia also colonialist?

> There was no need to justify Empire as something good for the people they were defeating/conquering.

I have a hard time believing this. If you had violently subjugated a people, killed them by the thousands, and settled on their ancestral land, building cities there, would you -- as an administrator -- not want to create some kind of concord between your people and the conquered, in order to justify your presence? And how else would you do this than by portraying yourself as something "good" for them (so that they want you there), for example by showing them the conveniences of your way of life: roads, aqueducts, public baths, huge marketplaces providing an abundance of material goods, and so on? This would also serve as a practical political measure, since it would minimize future rebellion -- especially if you could mollify the younger generations with what you have to offer and convince them that your rule is "good", or at least "not bad".