Saturday, March 04, 2006

Something Old Faster... Why?

One of the most amazing things about the time in which we live is that information is so freely available. Only recently has it become possible to consult your favorite search engine and have relevant information about whatever topic you choose at your fingertips in a matter of 0.55 seconds. By and large this is a great thing for humanity, as it has allowed collaboration that would have been impossible before. But could this great influx of information also have had a chilling effect on the way we learn and think? Before the word processor with its ubiquitous spell checker, one had to know how to spell. Before calculators were cheap and powerful, the young mathematician had no choice but to learn basic facts. The researcher would often commit many facts to memory so that they would be accessible and ready when needed, yet today, rather than memorizing, students are often told that it is more important to know where to find information than to have it readily accessible in their minds.

I recently attended a lecture at school where some of the most basic things I knew to be true about computing were challenged. The lecturer claimed that though Moore’s law has promised us faster and more efficient processors, our ideas of computing and software engineering really haven’t changed as drastically as he had hoped they would. Should universities granting degrees in “computer science” be teaching algorithms that were invented in the seventies, along with whatever programming language happens to be in vogue for the day, or should they be more focused on creating thinkers who could discover something radically different from the things we know today?

I am not convinced that it is a black-and-white issue, but it is an idea that I have given some thought in recent days. The lecture questioned software engineering and computer science from the standpoint that software engineers generally don’t mathematically model or prove that their programs will do what is expected of them, whereas every other field of engineering is expected to do that analysis before building begins. The whole thing makes me wonder about the tools we currently use to build our programs, and what could be done to make software engineering more exact. A question asked several times was, “Does your idea scale by a factor of 8 to 10 and still work?”

If we are to ask questions like that about the systems that we create, certainly we need a more powerful notion of programming, and the question remains, are today’s thinkers being prepared to solve these problems, or are they being so inculcated with the paradigm of the day that there is no room for creative thought?

I think that a lot of that depends on the mind of the individual, though the training the individual receives is a heavy influence. In a discussion between classes this week, a few friends were talking about how a CS program could be made more rigorous: specifically, the idea of students being required to take a physics course on electricity and magnetism, math courses through partial differential equations, and possibly some analysis. More important than those “background classes,” though, the curriculum should be geared toward producing programmers who can think outside the current paradigm, in addition to being great coders.
So in this age of information, are we getting soft? Rather than coming up with something new, are we just doing something old faster?
