I'm bringing a topic to the table that's pretty important. Understandably, you probably entered this discussion because you are quite annoyed at what the Christian right wing does. You probably hate all the hypocritical crap that religious people do...saying not to do one thing and then doing the 'sin' themselves...and how they blame all the evil in this world on the anti-Christ that is composed of homosexuals, Hollywood, and Democrats. Ticked off?
Me too, and I even grew up in the church.
And I still go to church. On my own.
Unfortunately, the Republican agenda (specifically, its link with GWB) and other foolish actions have completely skewed what true Christianity is about. It's perceived as a go-to-church-or-go-to-hell message that is SO WRONG. The Bible says Jesus is really about love, compassion, forgiveness, and sacrifice...and the ONE purpose of a Christian is to surrender his life to those teachings.
But it's instead perceived as a kitschy collection of bellyaching, hypocritical, fire-and-brimstone churches that exist to suck in emotionally deprived people, who can then be drained of their money to lift child-molesting pastors into the stratosphere of economic prosperity.
This really hurts me because I have a personal relationship with God (which is the heart of true, honest, Biblical Christianity), and it is so different than what most of the world thinks. I want to change that perspective...to heal our image...but I'm trying to figure out how to do it without setting off people's "religion propaganda" alarms. I find there are a few things that often tick people off about Christian messages in films.
1) They ambush you with no warning, leaving the audience feeling cheated.
2) They don't take into account authentic human emotions and aren't relevant to people.
3) They PREACH. Do people really preach to each other in life?!
4) The stories in these films SUCK or are BORING!
5) The style of the film is so "inspirational" (i.e., cheesy in a bad way) that it makes you want to watch a soft-core horror flick just to balance out the queasy feeling you get inside.
6) They're about the end times (puh-leez that's not what the Bible's about!)
7) They threaten you with the go-to-church-or-go-to-hell message.
8) No humor...at all... Isn't most of life here on earth a joke anyway? (This statement actually agrees with the Bible, believe it or not.)
I show my faith in my films, as it is. Jesus Christ is so very special to me, in that following him has completely changed my life. Why wouldn't I want to tell the world about it? Isn't art a form of self-expression? If I didn't put a strong element of change through faith in my characters, I would deny the very core of my existence. My art would become cheap.
I'm trying to make films that authentically express what I feel about faith, but a lot of people in the independent film community seem hair-triggered about anything that even smells of Christianity. How do I portray my personal faith without getting snickered at and pigeonholed as a right-winger? My goal is not to preach or get people 'converted.' I only want to show a perspective on life that is sorely missed by a great deal of humanity. Isn't that what films are supposed to do? To provide new experiences you've never had... to realize authentically what goes on in people's lives other than your own...
I just watched "Pieces of April." It's probably closer to the Christian faith than many elements of "The Passion of the Christ," IMO.