This post is inspired by the MSU IT Conference keynote by Gerry.
The education sector has fallen behind in its traditional role as an innovator in the IT field. Let’s take a look at some history first:
Back in the ’60s and ’70s there were two camps in IT: IBM/Bell and the universities/government. Sure, there were some other, smaller groups out there, such as Xerox, but for the most part ‘new’ things really came from those two places. The big businesses were focused on deploying their technologies to other big businesses, selling their mainframes, and so on. Universities were focused on research, and on actually creating these technologies.
The most popular programming languages, C, C++, COBOL, Pascal, Perl, and others, all came out of those two camps rather than the commercial software market. The TCP/IP stack, email, and web browsing over HTTP were also direct results of the research and education world. These are all things that essentially shaped where we have gone in the past 20 years. A few more examples are in the hardware that we use: sure, today’s PCs are based on IBM’s, Apple’s, and others’ designs, but much of the research from university computing projects such as the
Atanasoff–Berry Computer (ABC), MIT’s TX-0, and the MISTIC led to advances such as vector processing, clustered memory access, and RAID.
Sure, the big boys of the time had their innovations too, but the universities were, for the most part, on the leading edge.
So, where is the innovation today? Gerry posed this question to the group, and it really struck a chord with me. Why didn’t universities pioneer the search engine? The DVD-RW? The latest wireless standard?
Universities today are focused on the things that make them the most money: cures for cancer (yes, this is important), research on making ethanol fuel production more efficient, and better ways to fingerprint a person by their DNA. Biomedical work. If I were to ask which universities were doing research on, let’s say, a new email protocol that wasn’t susceptible to spam, nobody would be raising their hands.
Why are the colleges of the USA forced to purchase anti-spam and anti-virus products, email servers, directory servers, web servers, desktops, file servers, network gear, and so on from some of the largest companies in the world? Why are better versions of what we can buy today not developed (or at least experimented with) in house? Sure, Linux is out there, but it has become a commercial enterprise. There is very little left of Linux that was developed without the help of Novell, Corel, Red Hat, or IBM (yes, this is a gross overstatement, but it makes the point).
Universities are currently teaching old, tried-and-true technologies to their students. Students are learning Microsoft Office 2003 on Windows XP. They are being taught Microsoft C# and Sun Java. Companies are paying for computer labs that give students the basic skills they need for jobs in the workforce. Sure, that’s great. But where is the innovation? Why is it that students no longer come to their first job and say, "When I was at xyz-U, I helped with research on a solid-state memory chip that could read and write in less than 1 ns!" Today they come to their first job and say, "I know how to write a Visual Basic application that can display ‘Hello World’ on the screen."
As our society transitions to a mode where menial labor is cheap and intelligence is what sets us apart from the countries we outsource our work to, the colleges and universities need to take the bull by the horns and rediscover what made us the place to be in the days of yore. What possessed us to hand-build a supercomputer in our labs? What drove us to have our students work on an operating system unique to us? Why have these things fallen by the wayside, leaving the void to be filled by the major corporations?