I see that we are back to the war-on-Christmas line of shit. They love to ignore the fact that Christmas was taken over by capitalism long ago and is now about nothing more than money.
It seems that not a day goes by without another letter in the newspaper saying that the United States needs to get back to God or its Christian roots. Frankly, it's disturbing how many people here in Pennsyltucky feel that their beliefs are the only ones that matter. No one says that everyone has to agree with others' beliefs, but it certainly won't kill anyone to accept them and maybe learn from the differences.
I love how Christians claim that Christ is being taken out of Christmas. The gross commercialization of Christmas goes against everything Christ stood for, doesn’t it?
Dear fundamentalist Christians: please accept others for who they are. You know, like that fake guy you preach about. He'd forgive them.
Why do so many Christians feel the need to twist everything into being Christian? Not everything has to be about God.
The bible is the world’s most popular and dangerous work of fiction. That is all.