The death of Christianity in the U.S.
From The Baptist News, Miguel De La Torre, November 13, 2017: Christianity has died in the hands of Evangelicals. Evangelicalism ceased being a religious faith tradition following Jesus’ teachings c…