I’ve heard many people claim that Glee teaches anti-Christian morals. Well, I’ve just spent the entire weekend watching season one of Glee, and I’m calling BS! Glee doesn’t “teach” us anything about morals. It’s a television show centered around high school students, and it keeps things pretty realistic (within reason).
We can’t all live in fairytale land where everything is fine and dandy. Sometimes bad stuff happens, and this show doesn’t shy away from it. There is a Christian family on the show who kicks their teenage daughter out after they find out she’s pregnant, and as wrong as that is… it happens. There are parents out there who claim to be Christians but are more worried about their image than their children. Is it wrong of Glee to show that? Not at all.