Growing Leaders Blog

on Leading the Next Generation


One Change Universities Should Make

photo credit: Christopher Chan via photopin cc
Recently, I read an article in The Atlantic by former professor Marshall Poe called “Why Colleges Should Teach Religion to Their Students.” The article was compelling, and it doesn’t go down the path you might expect. It wasn’t a case against the separation of church and state; instead, it was a thought-provoking piece on how universities seem to be failing at a primary level: teaching young people how to live a self-sufficient, fulfilling life. That seems rudimentary for a liberal arts degree. Students may be mastering subjects like math, business, or political science, but they quit or graduate still ignorant of how to live a productive life. They figure out how to do “classrooms” and take exams, but not how to navigate the real world.

In my mind, this is just wrong.

It’s intriguing to me how many (if not most) colleges have drifted from their roots. When they were founded, colleges such as Harvard, Yale, Princeton, Duke and others were unique because they required all students to take faith courses—not just as vocational training for future ministers, but as instruction in how to incorporate faith into daily life. I realize many would argue this violated the separation of church and state, so the classes today would need to vary in type and be offered only as electives. But the truth is, kids today need someone to help them live well.

Former atheist Marshall Poe argues that at its root, genuine “faith” informs a person how to live life. He even believes it’s different from the study of comparative religions, because the need is not academic; it’s about the human soul. Poe indicates schools would need to offer a variety of options, but the offering would need to prevent grads from leaving school empty. In fact, Poe states:

American higher education is the envy of the world. Students come here from all over the globe to study. And American higher education is something we, as citizens, should be very proud of, for we built and fund a large portion of it. It’s really one of our crowning achievements as a nation.

American higher education has, however, one glaring deficiency: it does not teach its undergraduates how to live. It teaches them when the French Revolution was, what the carbon cycle is, and how to solve for X. It does not teach them what to do when they feel confused, alone, and scared. When they break down after a break-up. When they are so depressed they cannot get out of bed. When they drink themselves into unconsciousness every night. When they find themselves living on someone’s couch. When they decide to go off their meds. When they flunk a class or even flunk out of school. When they get fired. When a sibling dies. When they don’t make the team. When they get pregnant. When their divorced parents just won’t stop fighting. When they are too sick to get to the hospital. When they lose their scholarship. When they’ve been arrested for vandalism. When they hate themselves so much that they begin self-mutilating. When they’re thinking about suicide. When they force themselves to throw up after every meal. When they turn to drugs for relief from their pain. When they’ve been assaulted or raped. When their mind is racing and cannot stop. When they wonder about the meaning of it all. When they are terrified by the question “What do I do next?”

Many educators call these “life skills.” Call them whatever you want, but they are not being cultivated in many American homes, nor are they cultivated in many institutions of higher education. Students can graduate from many colleges without critical thinking skills or a healthy worldview. We have become bottom-line enterprises, businesses determined to churn out graduates, but we have forgotten what business we are supposed to be in. It isn’t just about numbers like retention rates, grade point averages, or the percentage of high school applicants our campus rejects. It’s about helping young adults get ready for the real world.

When Horace Mann championed our modern public school system nearly two hundred years ago, the teacher-training colleges he founded were called “normal” schools, designed to set the norms and standards for education. Great idea. I say, let’s do it again.


  1. pmadsen on May 7, 2014 at 6:32 am

    Thanks for the great article, Tim. I agree that religion classes of any type would be a step in the right direction for public education, but students need much more than that to really understand who they are and be successful in life. The Christian school (especially the Christian teacher!) is really the only one that can help students make the connections that are being ignored in secular society, namely: connecting work to service and ministry, faith to daily life, morality to absolutes, and people to purpose! All these connections are based on a different worldview—a God-centered worldview instead of a self-centered worldview.

  2. jmoore on May 7, 2014 at 11:34 am

    Very interesting article. Is the change in our teaching focus due to an assumption that many of these “life skills” are already being taught at home?

    • Tim Elmore on May 8, 2014 at 10:30 am

      Hey jmoore, thanks for the comment. As I mentioned in the post, I think the main cause is that universities have slowly drifted away from prioritizing courses that instill healthy life skills in students. Part of this is definitely due to the assumption you mentioned, but I feel a greater aspect is a failure in higher academia to realize that students need to balance their major-centered classes with courses (whether they’re religious, spiritual, or simply life-affirming) that both feed their souls and help them find their identities. This does not mean that it’s solely the university’s job to instill life skills in their students—as you mentioned, the hope and assumption is that students leave home and come to college with a healthy foundation to build upon. But as higher education has become more and more integral to the American way of life, it’s imperative that universities begin to balance educating students with facts and helping students find meaning for their lives.
