How teaching computer history can aid in programming courses

Posted in Blogs, Technology





For a beginner, the first programming course can be quite intimidating. First of all, the whole computer system seems like a magical thing that can only be built by geniuses, so there is a general perception that computer programmers must be very smart. An average student who encounters a fairly complex program, even in a high-level language like C, may lose confidence.

Secondly, programming languages are full of jargon that has crept in throughout the development history of computer systems. For example, when a student hears the word ‘debug’ for the first time, they cannot quite grasp its meaning.

Thirdly, students of non-English-speaking nations tend to perceive computers and computer languages as alien entities. Although many of the world’s languages are now supported in applications through the Unicode standard, all popular programming languages are written in English.

Finally, students who come in with some basic knowledge of computers and programming may question why university courses teach C/C++ rather than currently popular languages like Python or JavaScript, which would get them better jobs.

Due to these factors, students in their first programming course may get disheartened and resort to memorizing programs to pass exams, rather than learning the concepts and creating applications themselves.

How teaching computer history can aid in programming courses:

The history of computing is seldom given much importance in programming courses, at least in beginners’ courses. Most of the time, a single lecture is given so that students can prepare notes for a question that is certain to appear in the exam. But a proper discussion of the history of computing can address the problems discussed above.

A look into the history of the development of computer systems can make us realize that its invention is an eventuality for any intelligent civilization. A computer performs three major tasks: it stores information, it performs arithmetic and logical operations, and it carries out a task automatically if the instructions are given in advance (as programs). Humans have been storing information by various means since prehistoric times, e.g. drawings in caves. Mechanical devices for performing calculations were also used by early civilizations, e.g. the abacus. One of the earliest forms of feeding instructions to perform an automated task is the Jacquard loom, invented in 1804, which was used to weave patterns into cloth.

With these devices already in place, it was only a matter of time until modern computers were developed. Though it seems like a huge leap in technology, it is actually a long series of small increments that has led to current computer technology. When students have a proper understanding of this incremental process of development of computers and programming languages, they won’t be intimidated by it. It might even interest them in contributing to the further development of computers as a career option.

Some terminology used in computer science can be very confusing to beginners. For example, let’s take ‘debug’. The word by itself makes no sense, but from hearing it used in lectures a student can associate it with some form of testing a program. But debugging and testing are not the same thing. To grasp the true meaning of the word, we have to go back to the time when vacuum tubes were used in place of transistors. These tubes emitted heat and light while operating, and so they attracted insects (a.k.a. bugs). The bugs shorted out the circuits and caused failures. So, over the years, any cause of failure in code came to be called a bug, and finding and correcting it came to be called debugging.

For non-English-speaking people, it may seem that computers can only be operated by using English to communicate with the hardware. But in reality, the computer doesn’t care about human languages; it only understands two states of an electrical circuit, ON and OFF, represented as 0 and 1. Early computer programs were actually written in 0s and 1s, with the instructions fed in using punched cards or by manually flipping switches. Imagine: Indians could also have created these early computers, had we been advanced enough in electrical and switching technology. Even now, if one mapped the hexadecimal digits to the first six Hindi/Assamese letters instead of A, B, C, D, E and F, and worked their way up through assembly language and finally high-level languages, we could have a complete ‘Desi’ computer. Moreover, binary computers are not the only possible variant; decimal-based computers have also been tried, though they did not succeed as much.
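
To make this concrete, here is a minimal sketch in C (a hypothetical illustration, not from the original post) that prints a number using a table of digit symbols. The letters A to F in the table are just labels for the values 10 to 15; replacing them with any six other glyphs, say Devanagari or Assamese letters stored as UTF-8 strings, changes nothing about the arithmetic, because the hardware only ever sees the underlying bits.

#include <stdio.h>

/* Hexadecimal digits are only labels for the values 0-15.
 * Swap the last six entries for any other symbols (e.g. UTF-8
 * strings holding Devanagari or Assamese letters) and the
 * conversion below is completely unaffected. */
static const char *digits[16] = {
    "0", "1", "2", "3", "4", "5", "6", "7",
    "8", "9", "A", "B", "C", "D", "E", "F"
};

static void print_base16(unsigned int n)
{
    const char *out[8];   /* enough digits for a 32-bit value */
    int count = 0;

    do {                  /* collect digits from the least significant end */
        out[count++] = digits[n % 16];
        n /= 16;
    } while (n > 0);

    while (count--)       /* print them back in the usual order */
        printf("%s", out[count]);
    printf("\n");
}

int main(void)
{
    print_base16(48879);  /* prints BEEF with the default table */
    return 0;
}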

Finally, some students may also question the logic of teaching C/C++ in this modern era, when the industry is dominated by newer languages like Python and JavaScript. What they don’t realize is that to develop proficiency in a modern language, you need a solid grasp of the fundamental programming concepts that were developed along with the earliest languages and have been borrowed by every language since.
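
As a small, hypothetical illustration in C, consider how the shape of a basic loop has been carried forward: JavaScript and Java reuse this initialise/test/update syntax almost verbatim, and Python’s "for i in range(...)" expresses the same underlying idea.

#include <stdio.h>

int main(void)
{
    int sum = 0;

    /* The initialise/test/update structure of this loop is the
     * fundamental concept; later languages mostly changed its spelling. */
    for (int i = 1; i <= 10; i++)
        sum += i;

    printf("sum of 1..10 = %d\n", sum);
    return 0;
}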

 

How to incorporate history into the programming course:

One should certainly not teach computer history as a series of lectures before diving into practical programming. The better alternative is to refer to the history of a programming concept while teaching that concept. For example, while explaining what a compiler is, the teacher can briefly trace the development of programming languages from machine language to assembly language to compiled high-level languages.
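
As one possible classroom aid (a sketch, not something prescribed in the original post), a tiny C function lets students see all three levels side by side: the C source they write, the assembly the compiler emits, and the machine code that finally runs.

/* add.c - a one-line function for demonstrating what a compiler does.
 * Compile with "gcc -S add.c" to inspect the generated assembly (add.s),
 * or "gcc -c add.c" followed by "objdump -d add.o" to see the raw
 * machine-code bytes alongside their disassembly. */
int add(int a, int b)
{
    return a + b;   /* this one C statement becomes a handful of machine instructions */
}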

I think teaching the history of the development of computers will greatly increase students’ interest and help them understand programming concepts better.

 

 

If you liked this article, please comment and show your support and interest so that I’ll be motivated to continue this effort. Like our Facebook page if you haven’t already.
