The myth of US major media being "liberal"
Wednesday 18 November 2020 at 01:00

The myth that US major media are "liberal" is believed in other countries as well as in the US.
The major US media tend to be center-right "centrists," but right-wing activists describe them as "liberal" so as to mislead people about where the center of public opinion lies.