Take Back the Culture?
When Christians speak of "taking back the culture," are they (generally) referring to "taking over" the government and moving the country toward a theocratic society?
or
Do they mean proclaiming the gospel and facilitating individuals' changed hearts (and lives and choices, really) through the redemption offered in the gospel?
Because these are two very different, divergent positions... be careful out there...