An Era Has Ended

Every so often, just for fun, I look back at old posts to see if I still agree with them. Sometimes they have become even more relevant, as did one of the first “Unconventional” posts from 2014. So, let’s revisit the topic with new material, examining how America has become a post-Christian nation and what it might mean.

America was founded with the concept of religious liberty for all, and with the Judeo-Christian ethic as the foundation for morality in our culture. Frankly, that ethic no longer dominates. No need to list examples. Back in 2014, a friend posted a link from Red Letter Christians by Gary Alan Taylor, suggesting the change may be good for the kingdom of God (that link is at the bottom of this post; I encourage you to read it. I don’t agree with everything in it, but it’s worth reading).

Taylor described the marriage of church and state that began with Constantine as “a cosmic revolution…resulting in the alignment of the church with the ruling political regime of the day.” That arrangement has now ended in America. Can we truly say as a nation that “In God We Trust”? Realistically, those committed to Christian values are becoming a minority influence. Good or bad. Right or wrong. Like it or not.

Honestly, I'm torn about this. Relying on politics to advance the principles of faith is not only ineffective but also absent from the New Testament. Conversely, God's guidelines for behavior benefit all who practice them (not bringing salvation, but a better quality of life). So, I grieve for my society when it chooses behavior that gives it less than the best. Yes, society and individuals have the right to select their behaviors. We followers of Jesus, though, make a strategic mistake when we try to enforce biblical morality on non-Christians. The benefits come most to those who choose them. Let's be gracious participants in the discussion and focus on making faith an attractive and rational option.

Our culture will suffer from the loss of the Judeo-Christian ethic, but the loss is happening and not likely to be reversed anytime soon. My two solaces: it will force nominal Christians either to leave or to become authentic, and historically, the church tends to be more effective when it is a persecuted minority. Look at the first three centuries.

Taylor quotes noted theologian Stanley Hauerwas: “Christians would be more relaxed and less compulsive about running the world if we made our peace with our minority situation.” Perhaps these changes can help us focus on a pure faith, one not dependent on cultural approval. That sounds a lot like the first-century church that turned the world upside down. If we DID cause Americans to return to behaving like Christians without knowing God, wouldn’t they still miss out on heaven? So what has the kingdom of God gained? Nothing.

So, how do we respond? By fighting what looks like an unwinnable battle to transform the culture, or by winsomely changing hearts through spiritual transformation? By moaning about the changes, or by trying to use them for the kingdom? I think Jesus modeled the latter.

We live in interesting times, don’t we? Let’s creatively and graciously be change agents for God, not for government and culture.

Kick-Starting the Discussion

Which of the recent cultural changes concern you the most? Why? Do you think our culture has reached the tipping point in becoming post-Christian? In a post-Christian culture, what can followers of Jesus do to best transmit the essential message of Jesus to our world? How can you personally best bring Jesus to this new culture?

http://www.redletterchristians.org/death-dynasty/
