One thing that's really been irritating me is how, usually in editorials and opinion pieces, "the West" is treated like it's a single country. Yes, of course there are overlapping values and traditions in the Western world, but not to the extent a lot of people think.
Going along with that, it seems like every time someone makes a reference to "the West," it's negative.