Are Churches Secularizing America?
This article considers why the gospel is absent from much of American Christianity, arguing that many churches have embraced moralism, pop-psychology, and therapeutic deism in its place.
Source: Westminster Seminary California, 2008. 7 pages.