Column #21, posted August 2, 2005
Universities: revising what can’t be revised -- why it is important to build on-line universities
It isn’t possible to change the way universities are structured in any significant way. Faculties would never allow it, students would object, and people in general would wonder what ever had become of education.
With that in mind, I would like to make ten suggestions about actions that universities should consider taking if they actually want to make people believe that the education of students is their primary mission.
- Any student who applies should be admitted.
- Faculty should not determine the curriculum.
- Courses should not have fixed time lengths and should not be taken in parallel.
- Teaching should occur just in time.
- Courses should be dominated by projects not lectures.
- Curricula should be designed with job-preparation in mind.
- Success in school should not mean winning a competition.
- Graduation should be determined by a method other than total credits accumulated.
- The price of college should be cheap.
- The venue should be serious, i.e., not about or related to parties, drinking, football, fraternities, or extracurricular activities.
Since these ideas may be confusing (or infuriating), I will now consider them one at a time.
Any student who applies should be admitted.
Why are there admissions requirements for top universities? Basically there are three arguments for these requirements:
1. There is only so much space.
This argument is sometimes actually true. You can always stuff a few more into a lecture hall, but dorm rooms are expensive to build, and seminars need to be kept small in order to make them work. But, of course, space is not an issue in an on-line curriculum, so surely this reason wouldn’t apply there. So, an on-line curriculum wouldn’t have admissions rules then?
When I worked with Carnegie Mellon University to build their on-line curriculum in Computer Science, I proposed admitting anyone who applied. This idea was rejected because of argument number 2:
2. If everyone had a degree from an elite school, then it would no longer be an elite school, would it?
Well, maybe not. But why should that matter? A good on-line university might give out thousands of degrees in a given field. The relevant question to ask would be: are the graduates of this program capable of doing something in the real world for which they have been trained by the school?
Ah, so that is the real reason. Colleges don’t typically train anyone to do anything real, so there would be no way to judge. This is why you will never see Yale on-line. Yale would cease to be seen as “Yale” if suddenly there were hundreds of thousands of graduates. There is no way to judge if Yale graduates can do anything in the real world. That isn’t a measure that Yale uses. It was one I introduced in the CMU program I designed but, shall we say, the faculty weren’t quite ready for that new model.
And then there is the most commonly cited argument:
3. It is difficult to teach students who are not well prepared.
I really love this argument. It says, in essence, that if it is hard to teach certain students, they shouldn’t be taught. No idea of taking it as a challenge to get students ready to learn whatever it is you think they should know. Let’s just not let them in.
No. Let’s let everyone in. Let’s design schools that have no space issues (one good reason for on-line schools). Let’s design schools that do not dwell on their elite name, ones that simply prepare people to do stuff. And, if students aren’t ready to learn what is taught at these schools, let’s make sure we have a program that meets them where they are and gets them ready.
I know. Too radical. But, it can be done. “Anyone can go to a high quality university” is an important idea. When space is not an issue, it is a doable idea. Succeeding after admission would be up to them. But, in order to make this idea work, students would need to understand that if they can’t do the work they will be left behind. Implementing that idea requires that some of the changes below be implemented as well.
Faculty should not determine the curriculum.
This is, of course, another very radical idea. But the reasoning behind it is not at all obvious. Why does this matter? Faculty are the experts. Wouldn’t they know what students who are studying in their field should learn?
This is indeed an interesting question and one that strikes at the heart of what is wrong with today’s universities. The average student goes to college intending to graduate and get a job in a field relevant to what he studied in school. Not a radical thought, really. Seems right, no?
A professor who teaches the field that the student has decided to study has a number of problems with this pretty straightforward idea, however. The first problem the professor has with this idea is that he cannot relate to it. In general, professors have not actually worked in the real world versions of the disciplines they profess. A computer science professor, for example (this was my primary field when I was a professor), probably last wrote a computer program when he was a student in school. His specialty in computer science (mine was artificial intelligence) is what he wants to teach. Unfortunately, the average student needs to learn this specialty like he needs a hole in his head. But, the professor really doesn’t care about this. The professor wants to teach what he knows best, what he loves to think about, and what is the least work for him. So he makes up a dozen rationalizations about why his esoteric field is really very important for any computer science major to know.
Bear in mind that there are a lot of professors in any given department in a large university. And, they each want to teach their own specialty. And very few of them have real world experience. So, when all is said and done, the curriculum is a compromise hodgepodge of specialty subjects that are surely “very important for every student to know” which, when taken as a whole, will not even come close to getting a student started in his profession in the real world.
Think this is just true of computer science? I was also a psychology professor. Students in that field typically want to work in psychological services, health, counseling, social work and such. But, the professors’ specialties are again driving what is required. So everyone has to take cognitive psychology (my specialty in psychology) even if it in no way will help them counsel people. Well, it might help them. How could it hurt them? This reasoning is a cover for the fact that many psychology departments don’t have any faculty at all in clinical psychology because they do not consider it to be an academic subject. There is no idea at all, in most departments, of allowing students to be pre-clinical. They typically refuse to teach such practical subjects -- often because they really don’t know all that much about them.
Developmental psychology, for example, a subject that teaches about how children develop, is often filled with students who want to know about the children they expect to have some day. Will their professors accommodate them? No way. Their professors don’t necessarily know much about child raising. (I HAVE MET THEIR CHILDREN.) What they do know about is how to do research in developmental psychology. They know how to conduct research and feel that they should therefore teach students to conduct research even though their students have no intention whatever of ever doing research in real life.
Wait. It gets worse. Remember those introductory courses you took in psychology? Didn’t they seem dull? A mindless survey of everything anyone ever thought in psychology. And there was no way to get out of them if you wanted to study anything else in psychology. Want to know why that was the case? In any research university psychology department, the faculty need subjects for experiments. They get them from the intro course (remember the experiments you had to sit through in order to get credit?). So all students are funneled into that course in order to provide fodder to the experimental mill going on in the laboratories next door.
Do you see why you can’t trust faculty to teach anyone who is not preparing to be exactly the kind of professional that they are – namely a researcher in a particular specialty within a given field? Professors are enamored with theories and ideas precisely because that is what they deal with all day. If you want practical real world skills, you won’t learn those skills from them. But why should this be the case? Why should an undergraduate with no intention of becoming a researcher be unable to pursue a more pragmatic education?
This is simple to explain. Practitioners are looked down upon by researchers. Professors at top universities do not want to think they are training practitioners. That is just training and they don’t like it. They rationalize the irrelevant education they provide by saying it is about ideas and that they are teaching you to think. This is a wonderful rationalization that allows them to keep on teaching their own specialties and then be able to go quickly back to doing their research. Lesser universities want desperately to be like the big boys, so even when their professors are not themselves researchers they aspire to be like the heroes of their field. So, they teach the same courses they took when they were getting their Ph.D. It doesn’t get any better at universities that do not emphasize research.
In general, students and their real world needs and expectations are ignored by the faculty. Someone other than faculty needs to determine their course of study. Professors have proven that they don’t really care about this issue over and over again.
All this happens at any university that exists today but would not happen at a well-planned on-line university. Why not? Because any new on-line university would seek advice about what to teach from professionals who did not have a vested interest in the answer. In other words, no on-line university would have to employ a permanent faculty, so there would be no one to press their own special needs. In this way, students’ needs could actually be served by professionals whose only interest was making sure that students were well educated in their chosen field.
Courses should not have fixed time lengths and should not be taken in parallel.
The crux of the issue in teaching students is whether teaching them entails making them memorize information in order to pass tests, or whether it entails creating experiences for students from which they can learn through participation.
Since students typically take courses when they are in school, let’s put this issue another way. How long should a course last?
The university’s answer: fourteen weeks (plus or minus a few weeks).
And how often should a course meet?
The university’s answer: three hours a week (more or less).
One would be right in assuming, then, that a course should, for some reason, always be about 42 hours long. I wonder how that number was arrived at and how it happens that all courses are about the same length.
Do you think the answer might have anything to do with the needs of students? Or might it be more reasonable to assume that it has to do with the needs of the faculty?
When I was at Northwestern, I was expected to teach one course every two years. This course lasted 12 weeks and met for three hours a week. I was lured to Northwestern, in part, by this better deal. At Yale I had to teach one course every year. (And it lasted sixteen weeks!)
Boy. That sure is weird huh?
No. It isn’t. The more important you are, the less you teach. Teaching, for professors at the top universities, is considered a burden that one is always trying to get out of. Bad professors (ones that don’t publish or bring in research funds, for example) are punished with more teaching.
Now that you know this, and believe me what I am describing is quite normal at top universities, ask yourself why courses are structured the way they are. Whose interests are served by fixed course lengths and minimal course hours per week? You might think that having a student take four or five courses in a semester that are unrelated to each other serves the interests of student breadth and choice. But its actual purpose is to keep teaching from getting in the way of professors’ more important matters. If courses are only going to meet three hours a week, then students will need to take lots of them to keep occupied. The fact that this sometimes causes students to not be able to focus at all on some of their courses does not bother the faculty (unless theirs is the course being blown off).
Courses that are structured in this way do not really allow instruction that is anything other than lecture and test. Designing real experiences for students, ones that allow them to thoroughly investigate something, or build something, or design something, would take more than 42 hours and would require students to focus on only one or two courses at a time. This would, in turn, require professors to be available to help students whenever they needed help in pursuing whatever project they were involved in. So, while an intensive course might be good for students by letting them get really involved in something, it would be bad for professors since it would not allow them to consider teaching to be the least important aspect of their job.
And how long should a course take? As long as it takes to learn whatever it is the student is trying to learn how to do. But, how would that work exactly? Students would have to be allowed the freedom to pursue a project in the right time period for them (and for the project). This would mean that professors would have a life that was very unstructured, an unacceptable state of affairs for someone who has more important things on his agenda.
Courses as they exist today probably shouldn’t exist at all. They exist to make life easy for faculty. Real teaching would require real experiences. Designing and monitoring those experiences should be what faculty do. It would be what faculty would do in an on-line university. This kind of apprenticeship-type teaching only happens at the end of Ph.D. programs in today’s universities. Professors get serious about Ph.D. students. Perhaps they should get serious about everyone else.
Teaching should occur just in time.
In an on-line, learning-by-doing, experience-based learning environment, teaching occurs on an as-needed basis. Need help in what you are doing? Ask for it. Available to help: mentors, other students, and faculty. Teaching in an as-needed environment is not all that difficult really, especially since those faculty who do it do not have to lecture, meet classes, or grade tests. We did this in the on-line master’s programs we designed at Carnegie Mellon West and it is working just fine. The CMU West model should be the model for on-line universities for years to come.
Who should teach? Whoever is capable of mentoring a student through a particular issue. The idea of one teacher, one course is a classroom-based idea. In an on-line curriculum there can be math mentors, physics mentors, computer mentors, writing mentors and teamwork mentors all available as part of the same course (in a story that involves all of the above issues).
The idea that the experts who teach must be Ph.D.s who are top ranked researchers makes little sense in a project-oriented environment. While building a web site as part of a project on medical information, for example, the best mentor might be a professional web site builder, not the medical school staff.
Courses should be dominated by projects not lectures.
Many professors have recognized the value of project-based learning and it is not unusual to find this type of course in a university. One is more likely to encounter such courses in engineering or computer science or journalism. In other words, projects work well in courses where the end result is a student who has learned to actually do something. This should be true of all fields not just ones that are obviously about doing.
But, more important is the placement of those projects in the curriculum. Typically one finds the project course at the end of a curriculum – perhaps in the senior year in college. Why is that?
This again comes down to curriculum committees that have determined that one must know this or that before embarking on any real world experience. They are interested, as I said, in filling seats in introductory courses. If they let people do project courses first, it would not be possible to make the economics of the department work. Project courses are expensive to run. You have to have a small student-teacher ratio to make them work. The 500-1 ratio that works so well (in terms of money to a department) in a lecture course is replaced by 20-1, or worse, 10-1. The faculty gets paid the same one way or the other, so departments hate this. Never mind that, at least since Plato, scholars have been pointing out that we really only learn by doing. Students know this too, which is why they all prefer project courses to lecture courses. (That is, those who want to learn prefer them.)
Starting with a project makes a lot more sense in terms of deciding whether you like a field as well. Listening to someone talk about a field tells you much less than actually trying to do work in that field will ever teach you. Universities encourage summer internships for this sort of thing. Or, to put this another way, they leave the real teaching to companies that students can get to take them on for free. Unfortunately, the teaching there is hit or miss since the people in that company are unlikely to be teachers or care much about teaching.
Starting with projects, and, to be honest, continuing with projects (ideally ones that relate to one another), works best for students, but the economics of the university prevent that. Once again, the on-line university can take care of this issue quite effectively. The reason: just-in-time teaching is required for project-based learning, and just-in-time teaching works exceedingly well in an on-line environment. Teachers can be available on demand in an on-line environment, and suddenly the numbers start to work.
Curricula should be designed with job-preparation in mind.
“That sounds like a trade school, Roger.” This is what the President of Yale said to me when I suggested something similar be done at Yale while I was on the faculty there. The thing is, Yale (and every top university) is a trade school. The trade all the students are training for is – professor. When you study academic psychology instead of practical psychology, when you study theoretical computer science instead of programming, when you study calculus as part of your economics curriculum (which is the only way you can study business, because there is no business major) – this is getting ready for a job too. It’s just not the job you actually intended to prepare for.
Let students prepare for one or many jobs that interest them. Stop all the academic pretense. The idea that college produces young scholars is just not really true anymore. This idea reminds me of that phrase they use to describe the semi-professional football players that universities employ. They are called scholar-athletes by the TV announcers who describe their games. I am sure they are often found preparing a treatise on Roman history at half-time.
Success in school should not mean winning a competition.
There is someone who works for me who is a graduate of a very respectable academic institution. I often need him to write things, but I keep having to remind myself not to ask him because he simply cannot write a coherent English sentence. The other day I asked him how it was possible that he could be a graduate of this esteemed institution and yet not be able to write at all. He responded that he was a math major in college and that he had chosen this field precisely because there would be no papers to write. He knew he could not write. He therefore avoided writing courses. Now this may seem, and of course it is, the exact opposite of what college is supposed to be about. Shouldn’t one focus on what is hard and learn that? How naïve!
This man was on a scholarship. He did not want to do anything to jeopardize the scholarship, and that would include taking a course that might result in a bad grade. I remember a friend of my son who majored in psychology because it was full of multiple choice tests and he said he was very good at those tests. What a reason to study something! I remember a French kid in one of my French classes when I was in college. When I asked him what he was doing there, he said it was an easy “A”.
As long as education produces winners and losers we will all be losers. As long as school is viewed as a competition, we will have players of the game who are good at the game but learn very little. Here again the on-line university, open to everyone, can change the nature of courses from competitive events to exercises where the quality of what is produced is the only measure that matters.
Graduation should be determined by a method other than total credits accumulated.
Who thought up the idea that you are certified as a college graduate simply because you have accumulated a certain number of credits? Doesn’t that seem like a crazy idea on the face of it? Hang around long enough and you are done. Wouldn’t being done have something to do with demonstrated abilities? Shouldn’t you graduate when you have shown that you can do what people in the field you are studying do? Seems obvious to me.
It isn’t obvious to people who run universities because they don’t really see themselves as teaching anyone to do anything and because they wouldn’t get to charge four years of tuition that way.
You should graduate when you can demonstrate some abilities that you didn’t have before you started school.
The price should be cheap.
This one is easy. Tuitions in the tens of thousands of dollars limit education to a select few. Those prices come about because of expensive physical facilities and highly paid faculty. An on-line university would have neither. The tuitions can be low after the expenses of developing the on-line courses have been paid for. Or they can be low at the beginning if the development of the courses is paid for by people who don’t expect to get their money back. Then the expenses are just the operating expenses for mentors and the faculty who interact with students.
The venue should be serious.
When I was in college I was in a fraternity. I loved it. I was also on the football team. I learned to drink, deal with the opposite sex and generally grow up. College was a great experience for me. Unfortunately my parents were paying a rather hefty tuition bill for all this. It was unfortunate because I almost never attended class. I found the lectures boring and the subject matter of no interest. I didn’t quit because I was having such a good time and my parents would have been very disappointed. So I graduated on time. My grades were so bad that the person in charge of freshmen at the fraternity asked me how my grades could possibly be so low given that (at least he thought) I was so bright. I didn’t then resolve to improve them. I resolved to party on.
I insisted that my children join fraternities when they went to college. Why, you wonder? Because I learned a great deal in my fraternity about politics, social interactions, figuring out how to lead and influence people and so on. I also learned the majority of what I learned academically in college at my fraternity by hanging around the smarter kids and seeing what they were up to and occasionally helping them out.
What is wrong with this picture?
The social aspect of college is very important for the eighteen year olds who enroll there. Perhaps adults require something else, however. Adult education should not be burdened with rites of passage in which undergraduates participate. Learning needs to be divorced from the “college experience.” The on-line university is the alternative. Harvard will still be Harvard and it will still be great fun to go there. But the rest of the world, those who won’t go to Harvard, desperately need an on-line university that is open to all, cheap, and capable of providing mentored experiences that reflect the real world ambitions and needs of its students.