What is the biblical worldview on our culture? Are things getting better? Is humanity improving the world around it? The Bible is really clear on this one: our culture is dark, and it is getting darker. That is what the Bible teaches. It is not going to get better; it is going to get worse. Despite the fact that humanity has increased in scientific, medical, historical, educational, psychological, and technological knowledge to an astounding degree, we have not in any way, shape, or form changed our own basic nature. And we have not improved society.
Our confidence has increased, but our peace of mind has diminished. Our accomplishments have increased, but our sense of purpose and meaning has all but disappeared. Instead of improving the moral and spiritual quality of our lives, our discoveries and accomplishments have simply provided new ways to show ourselves for what we really are: depraved, sinful, and wicked.
Modern man has simply discovered new ways to corrupt and destroy himself. We go from war to greater war, from immorality to greater immorality, from perversion to greater perversion. The spiral is downward, not upward.
Some Christians try to isolate themselves from the world around them. But that is virtually impossible. You may remove yourself and your children from the culture, or at least attempt to, but know this: your culture will find you.
Withdrawing into a Christian subculture is not what we are supposed to do. Jesus prayed this for us: "I'm not asking you to take them out of the world, but to keep them safe from the evil one" (John 17:15). The objective of believers is not to isolate, but to infiltrate; not to evade, but to invade. We are to impact our culture without being compromised by it.