Recent fiction I've been reading and watching has included religion as part of the plot. Typically, the religious leaders are shown to be greedy and willfully deceptive, while the "true believers" are simply woefully ignorant suckers. I'm of two minds about this (which, when you're a brain in a jar, is tricky at best):
First of all, the religions usually depicted as fraudulent are, in fact, phony religions, and as a follower of Jesus, I would expect this to be the case. Additionally, sometimes the church is personified in fiction by its worst proponents. As a lover of truth, I like that. Evil shouldn't be allowed to continue, simply because it wears the guise of faith.
It also seems that when these fake, oppressive religions turn up in plots, there is no alternative to them but utter atheism. In other words, the authors seem to be saying, "All religions and faiths are like this phony one I've cooked up here." And in some cases, knowing the authors, that is exactly what they mean. Of course, if I had only ever had negative experiences with faith, I might be more accepting of this approach.
What do you think? Is no religion in fiction better than negative depictions of it?
Your comments welcome.