Technology in College: For Its Own Sake
This article was originally written for the Los Angeles Loyolan, the student-run newspaper at LMU. We thought most HackCollege readers would find it interesting too.
I’m part of the last generation of “Film Production” majors to go through LMU. The School of Film and Television is phasing out the major; the sophomores and freshmen already major in something different: just plain old “Production.” The point is to encompass both television and film (and maybe even new media) in one major. The class structure has been altered so that students must take both television and film classes. The idea is that the two workflows are gradually converging, as one-hour television dramas shoot on film and action movies get shot on digital video. That sounds pretty progressive for a film school, doesn’t it?
Unfortunately, changing the name of a major doesn’t do the trick. If you’re going to shoot a junior thesis, AKA PROD 300, you have to use film – actual film, like the stuff we once loaded into still cameras. Real film is expensive: the bare minimum budget for a junior thesis is about $3,000. It’s as goofy a requirement as it sounds. When you’re a junior, you effectively can’t shoot a fictional piece on a digital medium, in spite of the dawn of iTunes, high-definition television and outstanding digital cameras like the Red One. Granted, 35mm is certainly not obsolete, but it’s no longer a core skill for a media person these days.
Our school is chock full of such techno-hypocrisy. Tenured professors who still use AOL meet in 2008 with young people holding iPhones. These bigwigs sit on boards and set policies meant to prepare students for real life – real life as it was 10 years ago. Someone at LMU saw this happening and decided to do something about it: hence, the ITA. Every school at Loyola has an Instructional Technology Analyst, whose sole job is to push people toward podcasting and blogging. The ITAs report to IT – not to the deans and Jesuits of the past – which gives them enough autonomy to make changes quickly. It’s a step in the right direction, and I applaud it, but I still have to watch my peers struggle with celluloid just to meet the requirements for graduation.
Why do people need to learn digital video instead of film? It seems silly to ask this question, but it sheds light on other situations. If the industry is moving toward cameras with hard drives instead of clockwork, shouldn’t our students be learning about those instead? By the same token, if PR is moving toward MySpace and lectures are moving toward iTunes U, shouldn’t the same follow for our communications and education majors?
In the technology biz, people are critical of one new phenomenon in particular: the wiki. A wiki is a simple online space for organizing and sharing information. There are wikis besides Wikipedia – ones for traveling, planning weddings or Philosophy 101 classes. Few would argue that there’s anything inherently bad about a wiki. The resentment is for the “throw a wiki at it” mentality that many leaders trumpet – the notion that letting users put all the information in one place can solve every problem. Wikis can’t solve every problem, and neither can digital video, blogs or Blackboard. One might ask: “But isn’t that what these ITAs are doing? Just promoting technology for the sake of it?”
They are, but in the classroom, it’s different. I never thought I’d be doing this, but here is an argument for “technology for the sake of technology.” I really think that educational institutions are an exception to the “throw a wiki at it” misconception, because throwing a wiki means a class of students has to learn how to catch one. Students need to know these technologies so that they can use them when they graduate. In other words: if we still wrote our papers on typewriters, none of us would get jobs.