
What Does the Bible Say about the Positive Influence of the Bible in America?


Christianity's Influence on Education

Contrary to popular opinion, Christians are not anti-education. The Bible instructs us to teach our children. The real issue is not eliminating education, but ensuring that what is taught is accurate and does not contradict the Word of God. Most of our well-known universities, such as Harvard, Princeton, and Yale, were founded by Christians. In fact, the first 126 colleges and universities in the U.S.A. were built for the glory of Jesus and the advancement of the gospel.[2] Christians took the Bible seriously when the Word of God commanded them to teach and train their children in the way of the Lord.

And these words, which I command thee this day, shall be in thine heart: And thou shalt teach them diligently unto thy children, and shalt talk of them when thou sittest in thine house, and when thou walkest by the way, and when thou liest down, and when thou risest up. Deuteronomy 6:6-7

How sad it is that our laws now forbid Christian students to pray publicly in the classrooms of our schools. Instead, violence stalks the halls of our education centers, and children are committing murder in the classrooms. Each law that eliminates God and His influence from our society allows evil to fill its place. Disallowing public prayer in our schools is another law that needs to be reversed.


[2] D. James Kennedy, Ph.D., The Bible and Civilization, Coral Ridge Ministries, Ft. Lauderdale, Florida.
