Hollywood and the movie industry are stereotyped as overwhelmingly "liberal." What in the history of cinema led to Hollywood being seen as so left-wing?

According to the introductory chapter of "Movies and Mass Culture" by John Belton, early cinema came to America during a challenging political time. The nation was still feeling tension from the post-Civil War era and the expanded federal government that emerged from it. America was also entering an age of industrialization that dramatically changed the world, seemingly overnight.

During this time, American political thought was largely being shaped by two major movements: Populism and Progressivism. You can see this in early American cinema, such as in the progressive "A Corner in Wheat" (1909), which focuses on the plight of rural farmers, or in the populist message of "The Birth of a Nation" (1915), which focused on the perceived need for a return to white supremacy in the American South. Most of the popular films in early American cinema carried a message that belonged in one of these political camps, and (according to Belton) this is still seen in much of Hollywood.

While what is understood as "populist" and "progressive" has changed, Belton argues that these ideologies are still hardwired into how the majority of Americans approach film. According to Belton, Hollywood films usually either confirm or resist the dominant ideology (and sometimes do both). If true, one could argue that Hollywood has always had both a progressive agenda and a populist agenda, though the populist agenda usually receives less attention simply because it typically appeals to the majority and does not challenge its audience in the way the progressive agenda might.

/r/AskHistorians Thread