
Published in The National on May 5, 2000

Information Technology ... the way ahead

Welcome to the first issue of the Tok IT Page. IT, which is short for Information Technology, has been with us for many years in various guises.

IT is simply the modern way of dealing with information, whether that means storing, transferring, sharing, creating or manipulating it. Information today comes in three main formats ... text, graphics and sound.

When we talk about IT, we also touch upon the tool that made it all possible ... the computer. The first computer may be said to be the abacus ... a rack with sliding beads used to perform calculations fast (a skilled abacus user can, in fact, be faster than someone using a modern electronic calculator).

And that was some 5,000 years ago in Asia. We will talk about the history of computers in the next issue of the Tok IT Page.

For now it is sufficient to note how the computer has made inroads into the lives of people. Papua New Guinea may not be as saturated in technology as many of her neighbours, but you don't have to look far to find something that has been made possible or available thanks to computers. You may not understand computers, but you are doubtless affected by them.

Which is one good reason why we have the Tok IT Page. We aim to inform you, the reader, about Man's best invention. The world is moving very fast in this age of Information Technology. So as not to be left behind, PNG must seize all opportunities.

Let's start talking. 


Daniel Lam, Tok IT Editor

By Daniel Lam
You see it everywhere. For those who have had little exposure to this pervasive "entity", it is nothing short of marvellous, the things it can do. The counter clerk makes a few taps on a square board with buttons and voila! things happen. It tallies your grocery bill. It allows us to withdraw cash after office hours.

It has been described as Man's second best (if not the best) invention after the wheel. With this amazing "tool", for in essence that is what it is, Man has been able to perform feats long deemed impossible.

Can't solve complex calculations? This tool can handle them. At present, it can perform close to one billion such calculations every second.

We are talking about the computer, of course. Surprisingly, it can be difficult to describe what exactly a computer is.

One particular textbook defines a computer as an electronic device, operating under the instructions stored in its own memory unit, that can accept data (input), process data arithmetically and logically, produce results (output) from the processing, and store the results for future use.

Most computers also include the capability to communicate by sending data to and receiving data from other computers, and to connect to the Internet (more on this in a later issue of the Tok IT Page). Often the term computer or computer system is used to describe a collection of devices that function together to process data.

At Tok IT, we see the computer as an electronic tool that helps users make decisions faster and better.

A computer is made up of two parts: hardware and software. Hardware is physical, and software is not.


Hardware
There are many components that make up the hardware. Although different computers can have a variety of components, they all fall under three main categories:

Input Devices
Input devices allow us to interact with the computer and enter (issue) commands into it. For most people who use personal computers, the two most important input devices are the keyboard and the mouse.

The keyboard generally has buttons (known as keys) corresponding to letters and numbers, allowing us to type in text and numbers, as well as other keys with specific functions.

A mouse is a hand-held device that allows us to move a pointer on the screen. When the pointer is placed at the proper position on the screen (i.e. pointing to a button, a slider, or another image on the screen), we click on one of the mouse buttons to select that item and thereby issue commands to the computer.

Sometimes computers have other input devices such as card readers, scanners, and voice recognition devices. Still others may have gamepads and joysticks.

All of these provide some way for a computer user to provide commands to the computer.


System Unit
This term refers to a box that contains the electronic circuits that carry out the data processing. The electronic circuits of a computer are connected to a main circuit board (often called a motherboard or mainboard).

The motherboard holds the Central Processing Unit (CPU), which executes instructions and controls the order in which the computer carries out its operations. Within the CPU is an arithmetic/logic unit (ALU) that performs mathematical operations and evaluates logical expressions.

The CPU is often called the processor. The system unit also contains memory chips known as random access memory (RAM) chips, which temporarily store data and operating instructions when the computer is turned on.

This storage is "temporary" because it is lost when the computer is turned off. The system unit also contains a wide variety of electrical components that connect the input and output devices to the other units of the computer.


Output Devices
Just as we need input devices to provide commands and data to a computer, we need output devices to receive information back from it. Examples of output devices include the monitor and the printer. Today many computers also come with speakers that provide sound output.

Other Things
Computers are also capable of storing large amounts of data on a more permanent basis than in RAM chips. The devices that do this are linked to the motherboard and are called secondary storage units. The information stored in these units is not erased when the computer is switched off.

Examples of these units include floppy disk drives, hard disk drives and CD-ROM drives.

Today these devices are typically housed inside the same case as the system unit, but they are really separate components. The processor takes longer to access information stored on these devices than information stored in RAM, but at least the information is permanent.
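
To make the difference concrete, here is a minimal sketch written in Python, a modern programming language (the file name notes.txt is simply an invented example): the text held in a variable lives in RAM and disappears when the program ends, while the copy written to disk survives until it is deleted.

# A small illustration of "temporary" RAM versus permanent secondary storage.
message = "PNG must seize all opportunities."   # held in RAM; gone once the program ends

# Writing the same text to a file puts it on secondary storage (a hard disk, say),
# so it will still be there after the computer is switched off and on again.
with open("notes.txt", "w") as f:
    f.write(message)

# Reading it back from the disk shows the copy survived outside RAM.
with open("notes.txt") as f:
    print(f.read())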

There are other components that make up the modern computer.

Communication devices are one example. These include modems (which allow us to connect our computer to other computers via a telephone line) and network interface cards (which allow us to connect our computer to other computers via network cabling). A network basically refers to a group of computers that are connected together.


Software
Having all the hardware is useless without the software. Software is the link between the user and the hardware. With the appropriate software the user can perform amazing feats with the computer. Without software, the computer becomes little more than a very expensive paperweight.

For example, this article was written using software. Sometimes software is called a computer program.


What computers do
Any type of computer can be thought of as capable of performing four basic types of operations: input, processing, output and storage. First, data (numbers, words, pictures, sounds, symbols etc) is input into a computer.

Then the computer processes (manipulates, sorts, organises, performs arithmetic operations etc) the data. Finally, the computer outputs the results of its processing.

Often, the computer also stores both the data and the results of the processing (which are often referred to as information).
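
Those four operations can be sketched in a few lines of Python; the list of marks and the file name results.txt below are invented purely for illustration.

# The four basic operations: input, processing, output and storage.
marks = [72, 55, 88, 64]              # input: data entered into the computer

average = sum(marks) / len(marks)     # processing: arithmetic performed on the data

print("Average mark:", average)       # output: the result presented to the user

with open("results.txt", "w") as f:   # storage: the information kept for future use
    f.write(str(average))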

Computers are categorised in many ways, primarily based upon their size and complexity.
Among the common textbook categories of computers are personal computers, servers, minicomputers, mainframes, and supercomputers.

Personal computers (PCs), which were initially called microcomputers, include the subtypes of palmtop, notebook, desktop, workstation and others. Basically, a personal computer is a small unit designed to be used by one person at a time.

Today the term "PC" is synonymous with IBM computers or compatibles (meaning similar in hardware and software). Tok IT will delve into this in a later issue. In any case, a personal computer is generally any computer designed for one user.

A server generally refers to a computer that is designed to support a network of computers. Typically, a server works as the central computer in one or more networks. It may have more than one processor, high-speed communication facilities, and large memory and storage capabilities. Usually a number of personal computers will interact with the server through a network.

A minicomputer is a much bigger machine than a personal computer, to which multiple users can be connected at the same time via terminals. Their commands go directly to the minicomputer, rather than to a unit sitting in front of them. Users share the same computer through a time-sharing arrangement that gives each user a few milliseconds of processor time before moving on to the next user. The time-sharing operation is so fast that each user has the illusion of being the sole user.
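
Time-sharing can be pictured with a short Python sketch; the user names and the tiny time slice below are invented for illustration, and a real minicomputer's scheduler is of course far more elaborate.

import time

# A toy round-robin scheduler: each user gets a short slice of the processor
# in turn, so quickly that every user feels like the sole user.
users = ["Anna", "Bore", "Carol"]     # hypothetical terminal users
SLICE = 0.005                         # a few milliseconds per turn

def serve(user):
    time.sleep(SLICE)                 # stand-in for doing a little of this user's work
    print("served", user, "for", int(SLICE * 1000), "milliseconds")

for cycle in range(3):                # cycle through the users again and again
    for user in users:
        serve(user)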

Mainframe computers are vastly larger systems that can handle hundreds of users simultaneously.

Supercomputers are the most powerful and expensive computers. They can process billions of operations per second, and are used in engineering design, military and space programs, and weather forecasting.


Published in The National on May 12, 2000

From beads to silicon

Nothing reflects modern living better than the computer. Computers are everywhere. The first ones may have been meant for performing complex calculations, but now they are used for leisure and communication as well.

It may be an exercise in futility to figure out just how much the computer has changed and influenced our lives, but it wouldn't hurt to know just how this marvel of mankind's genius came to be.

Last week we introduced the computer. To fully understand and appreciate it, however, we must understand its evolution. Maybe then we can see where all this technology is heading.

By Daniel Lam
Thousands of years ago (probably around 1,000 BC) in Asia, mankind came up with a tool to help perform simple mathematics. The device was simple ... a number of sliding beads arranged on a rack ... and the system of calculation was equally so. It was probably used by traders to keep track of their business. It was the abacus.

It was many, many centuries later that the next significant advance in computing technology took place.

In 1642, Blaise Pascal, the 18-year-old son of a French tax collector (somewhat like a representative of PNG's Internal Revenue Commission), invented what he called a "numerical wheel calculator". It was also called a Pascaline, and it was meant as an aid to his father's work. It was limited to additions - no subtractions, multiplications or divisions. This brass rectangular box used eight dials to add sums of up to eight figures (from 1 to 99,999,999).

Young Pascal's invention used a base of 10 to accomplish this. For example, when the "ones" dial moves 10 notches, or one complete "revolution", it moves the next dial (the "tens") one place. When the "tens" dial completes one "revolution", the dial representing the "hundreds" moves one notch, and so on.
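
The same carrying rule is easy to mimic in a few lines of Python; the list of eight numbers below simply stands in for the Pascaline's eight dials.

# A toy model of the Pascaline's carry: eight dials, each showing 0-9,
# with a full revolution of one dial moving the next dial along by one.
def add_on_dials(dials, amount, position=0):
    while amount and position < len(dials):
        total = dials[position] + amount
        dials[position] = total % 10      # where this dial comes to rest
        amount = total // 10              # complete revolutions carry over...
        position += 1                     # ...into the next dial up
    return dials

dials = [0] * 8                           # the "ones" dial first, then "tens", and so on
add_on_dials(dials, 7)                    # add 7
add_on_dials(dials, 5)                    # add 5 more: the "ones" dial passes 10 and carries
print(dials)                              # [2, 1, 0, 0, 0, 0, 0, 0], which reads as 12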

A little over half a century later, in 1694, German mathematician and philosopher Gottfried Wilhelm von Leibniz improved on the Pascaline, allowing the machine to multiply as well. Mr Von Leibniz's mechanical multiplier, like the Pascaline, worked by a system of gears and dials.

It wasn't until more than a century later, in 1820, that mechanical calculators gained widespread use.

Charles Xavier Thomas de Colmar, a Frenchman, invented a machine that could perform the four basic mathematical functions. Mr De Colmar's mechanical calculator, called the "arithmometer", presented a more practical approach to computing because it could add, subtract, multiply and divide.

With its enhanced versatility, the arithmometer was widely used up until World War I. Even so, the computer as we know it today had its beginnings only in the 19th century.

Mathematics professor Charles Babbage, who was examining calculations for the Royal Astronomical Society, was absolutely frustrated with the number of errors he had found. He was said to have declared: "I wish to God these calculations had been performed by steam!"

By 1812, Prof Babbage had found a "natural" harmony between machines and mathematics - he realised that machines perform tasks over and over again without mistake, while mathematics often requires the repetition of steps.

The solution, therefore, lay in applying the machine to meet the needs of mathematics. His first attempt came in 1822, when he proposed a machine called the Difference Engine to compute mathematical tables using the method of differences. Looking somewhat like a locomotive and, like one, powered by steam, the machine would have a stored program and could perform calculations and print the results automatically.

Ten years later, Prof Babbage began work on the first general-purpose computer, called the Analytical Engine. The engine, however, existed mainly on paper and in the minds of those who understood it. It seems primitive by today's standards, but it outlined the basic elements of a computer and was a breakthrough concept. Incorporating over 50,000 components, the basic design of the Analytical Engine included input devices in the form of perforated cards (like punch cards) containing operating instructions, and a "store" for up to 1,000 numbers of up to 50 decimal digits each.

It is worth noting that Prof Babbage borrowed the idea of the punch cards from the Jacquard loom. The loom, produced in the early 1800s, was named after its inventor, Joseph-Marie Jacquard. Mr Jacquard used punched boards to control the patterns to be woven.

In 1889, American inventor Herman Hollerith applied the Jacquard loom concept to computing. His first task was to find a faster, more efficient and more accurate way to compute the US census. The previous census, in 1880, had taken seven years to count. With an expanding population, the census bureau feared its next count would take 10 years.

Mr Hollerith used cards to store the data, which he fed into a machine that compiled the results mechanically. Instead of 10 years, census takers managed to compile their results in six weeks with Mr Hollerith's machine.

Then he brought his punch card reader into the business world, founding the Tabulating Machine Company in 1896.
In 1924, after a series of mergers, the Tabulating Machine Company became International Business Machines (IBM).

In the ensuing years, several engineers made other significant advances.

Vannevar Bush developed a calculator for solving differential equations in 1931. The machine could solve complex differential equations that had long left scientists and mathematicians baffled. The machine was cumbersome because hundreds of gears and shafts were required to represent numbers and their various relationships to each other.

To eliminate this bulkiness, John V. Atanasoff, a professor at Iowa State College (now called Iowa State University), and his graduate student, Clifford Berry, envisioned an all-electronic computer that applied Boolean algebra to computer circuitry. By extending this concept, in which every value is either on or off, to electronic circuits, Prof Atanasoff and Mr Berry had developed the first all-electronic computer by 1940.
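
The idea can be illustrated in a few lines of Python: treating "on" as 1 and "off" as 0, two Boolean operations are enough to add a pair of binary digits. This little "half adder" is offered purely as an illustration of Boolean algebra at work, not as anything taken from the Atanasoff-Berry machine itself.

# Boolean algebra on on/off values: adding two binary digits (a "half adder").
def half_adder(a, b):
    sum_bit = a ^ b       # exclusive OR: on when exactly one input is on
    carry = a & b         # AND: on only when both inputs are on
    return sum_bit, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "->", half_adder(a, b))   # prints (sum, carry)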

Unfortunately, their project lost its funding and their work was overshadowed by similar developments from other scientists.


Published in The National on May 12, 2000

Five generations of computers

By Daniel Lam
So the computer had its beginnings in the humble abacus. And Blaise Pascal's ingenuity allowed future inventors and scientists to come up with something that has changed the way people live, work and have fun.

However, computers really started taking shape during World War II. Without going into detail about that global conflict, it is sufficient to note that it was a time when the saying "Necessity is the mother of invention" really held true.

In the race to outdo each other, the Allied and Axis forces came up with devices that fast-tracked the evolution of computers.

The evolution of computers can be divided into five generations:


First Generation (1945-1959)
With the onset of World War II, governments sought to develop computers to exploit their potential strategic importance. This increased funding for computer development projects hastened technical progress.

By 1941 German engineer Konrad Zuse had developed a computer, known as the Z3, to design airplanes and missiles. The Allied forces, however, made greater strides in developing powerful computers.

In 1943, the British completed a secret code-breaking computer called the Colossus to decode German messages.

American efforts produced a broader achievement. One of them was the Electronic Numerical Integrator and Computer (ENIAC), produced by a partnership between the US government and the University of Pennsylvania.

Consisting of some 18,000 vacuum tubes, 70,000 resistors and five million soldered joints, the computer was such a massive piece of machinery that it consumed 160kW of electrical power, enough to dim the lights in an entire section of, say, the National Capital District.

Developed by John Presper Eckert and John W. Mauchly, ENIAC, unlike the Colossus, was a general-purpose computer, and it calculated at speeds 1,000 times faster than the electromechanical machines that came before it.

In the mid-1940's John von Neumann joined the University of Pennsylvania team, initiating concepts in computer design that remained central to computer engineering for the next 40 years.

Mr Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) in 1945, with a memory to hold both a stored program and data.

This "stored program" technique, as well as the "conditional control transfer" that allowed the computer to be stopped at any point and then resumed, allowed for greater versatility in computer programming.

The key element to the Von Neumann architecture was the central processing unit, which allowed all computer functions to be coordinated through a single source.

In 1951, the UNIVAC I (Universal Automatic Computer), built by Remington Rand, became one of the first commercially available computers to take advantage of these advances.

First generation computers were characterised by the fact that operating instructions were made-to-order for the specific task for which the computer was to be used.

Each computer had a different binary-coded program called a machine language that told it how to operate.
This made the computer difficult to program and limited its versatility and speed. Other features of first generation computers were the use of vacuum tubes (responsible for their breathtaking size) and magnetic drums for data storage.


Generation II (1956-1963)
The invention of the transistor in 1947 greatly changed the computer's development. The transistor replaced the large vacuum tube in televisions, radios and computers. As a result, the size of electronic machinery has been shrinking ever since.

The transistor was at work in the computer by 1956. Transistors led to second generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

The first large-scale machines to take advantage of this transistor technology were early supercomputers, Stretch by IBM and LARC by Sperry-Rand.

These computers, both developed for atomic energy laboratories, were capable of handling an enormous amount of data, a capability much in demand by atomic scientists.

The machines were very expensive, however, and tended to be too powerful for the business sector's computing needs.

Second generation computers replaced machine language with assembly language, allowing abbreviated programming codes to replace long, difficult binary codes.

Throughout the early 1960's, there were a number of commercially successful second generation computers used in business, universities, and government from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others.

These second generation computers also contained all the components we associate with the modern day computer: printers, tape storage, disk storage, memory, operating systems, and stored programs.

More sophisticated high-level languages such as COBOL and FORTRAN came into common use during this time, and they remain in use to this day.

These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas, making it much easier to program a computer.
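
As an illustration of the difference, the short sketch below uses Python, a present-day high-level language, purely as a stand-in for COBOL or FORTRAN; the interest figures are invented. The single formula replaces what a first generation programmer would have had to spell out as a long string of binary codes, or an assembly programmer as a sequence of abbreviated steps (load, multiply, store and so on).

# A high-level language lets the programmer write the formula directly.
principal = 1000.0                          # kina deposited (an invented figure)
rate = 0.05                                 # yearly interest rate
years = 3

amount = principal * (1 + rate) ** years    # one readable line of "words and formulas"
print(round(amount, 2))                     # 1157.63

# In machine language the same calculation would be a long string of binary codes;
# in assembly language, a series of short mnemonic instructions.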

New types of careers (programmer, analyst, and computer systems expert) and the entire software industry began with second generation computers.


Generation III (1964-1971)
Though transistors were clearly a major improvement over the vacuum tube, they still generated a great deal of heat, which damaged the computer's sensitive internal parts.

The quartz rock was the solution. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958.

The IC combined three electronic components onto a small disc of silicon, which was made from quartz. Scientists later managed to fit even more components onto a single chip of semiconductor material.

As a result, computers became ever smaller as more components were squeezed onto the chip.


Generation IV (1971-Present)
After the integrated circuit, things could only get smaller.

The development of large-scale integration (LSI) allowed hundreds of components to be fitted onto one chip.

By the 1980's, very large scale integration (VLSI) squeezed hundreds of thousands of components onto a chip.

Ultra-large scale integration (ULSI) increased that number into the millions.
The ability to fit so much onto an area about half the size of a one toea coin helped diminish the size and price of computers.

It also increased their power, efficiency and reliability.

The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a minuscule chip.

Whereas previously the integrated circuit had had to be manufactured to fit a special purpose, now one microprocessor could be manufactured and then programmed to meet any number of demands.

Soon everyday household items such as microwave ovens, television sets and automobiles with electronic fuel injection incorporated microprocessors.

Such condensed power allowed everyday people to harness computers, which were no longer developed exclusively for large business or government contracts.

By the mid-1970's, computer manufacturers sought to bring computers to general consumers.

These microcomputers came complete with user-friendly software packages that offered even non-technical users an array of applications, most popularly word processing and spreadsheet programs.

Pioneers in this field were Commodore, Radio Shack and Apple Computers. In the early 1980's, arcade video games such as Pac-Man and home video game systems such as the Atari 2600 ignited consumer interest in more sophisticated, programmable home computers.

In 1981, IBM introduced its personal computer (PC) for use in the home, office and schools.

The 1980's saw an expansion in computer use in all three arenas as clones of the IBM PC made the personal computer even more affordable.

The number of personal computers in use more than doubled from an estimated two million in 1981 to over five million in 1982.

Ten years later, 65 million PCs were in use. Computers continued their trend toward a smaller size, working their way down from desktop to laptop computers (which could fit inside a briefcase) to palmtop computers (able to fit inside a breast pocket).

In direct competition with IBM's PC was Apple's Macintosh line, introduced in 1984.

Notable for its user-friendly design, the Macintosh offered an operating system that allowed users to move screen icons instead of typing instructions. Users controlled the screen cursor with a mouse, a hand-held device whose movements across the desk were mirrored by the cursor on the screen. This was known as a Graphical User Interface (GUI). It was several years before IBM PC users were treated to the same.

As computers became more widespread in the workplace, new ways to harness their potential were developed. As smaller computers became more powerful, they could be linked together, or networked, to share memory space, software and information, and to communicate with each other.


Generation V (Present and Beyond)
Computers today have reached a level previously not thought possible. In fact, the Year 2000 Bug (Y2K, also known as the Millennium Bug) was a major worry precisely because early programmers, trying to save scarce memory, recorded years with only two digits and did not take the change of century into account.

Ten years ago, CPUs running at a clock speed of 33MHz were top-of-the-line. Just last month giant computer chipmakers Intel and AMD unveiled Pentium III and K7 Athlon CPUs clocking at speeds in excess of one gigahertz (that's 1,000MHz!).

Considering just how much modern-day computers are capable of, some visionaries envision a day when computers can think.

Thinking computers, or artificial intelligence (AI), have been the subject of many books, and of course, cinema (think The Matrix, Star Trek or Arthur C. Clarke's 2001: A Space Odyssey).

In reality, AI is still very far off. Many of the functions of AI in the fictional world are very difficult, if not impossible, to achieve.

Computers today can accept words, whether written or spoken. They can also imitate human reasoning.

But human understanding, which is a very important component of intelligence, relies very much on context and meaning.

Certain modern-day computers already exhibit fifth generation attributes. They can be used to predict, say, weather patterns. Or help doctors diagnose their patients' ailments.

But they remain tools. And that is what computers really are.

Sources: Mostly from the Internet, computer magazines and the Encyclopaedia Britannica.