Published in The National on December 29, 2000

2000 - the year IT went boom

The year 2000 has been a most interesting one for IT. Tok IT looks at the big things that have made the industry even more exciting this year.

By Daniel Lam
It has been an eventful year for IT. Just seconds before the clock struck midnight on Dec 31 last year, the eyes of the world were on Australia and New Zealand, the first developed countries to advance into the year 2000.

The reason for the vigilance? The simple fact that computers, the older ones that is, store the year as just two digits. Because they record 1980 as 80 and 1999 as 99, they would see 2000 as year 00, which their logical processors could read as 1900, or as no valid year at all.
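
A minimal sketch of the arithmetic behind the scare (an illustration written for this column, not any real system's code): a program that keeps only two digits for the year gives sensible answers right up to the rollover, then produces nonsense.

    # Illustrative only: a program that stores just two digits of the year.
    def age_in_years(birth_year, current_year):
        # Fine through the 1990s, wrong the moment the year becomes "00".
        return current_year - birth_year

    print(age_in_years(80, 99))  # born 1980, checked in 1999: 19 (correct)
    print(age_in_years(80, 0))   # born 1980, checked in 2000: -80 (nonsense)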

There were fears that power stations would malfunction, thereby cutting off power to billions worldwide. There were fears that planes would fail to take off, or even drop out of the sky. Nuclear reactors would leak or nuclear silos would unleash their deadly contents.

But life goes on, and better yet, the Y2K bug (or Millennium Bug for some) simply didn't happen ... not on a wide scale, anyway. And the IT industry sighed with relief and moved on.

Other things quickly occupied people's minds.

Internet security, B2B vs B2C and the Napster revolution quickly moved computing news off the technology pages and onto the front pages of most news outlets. Processor speeds touched the magical 1GHz mark. Storage real estate got cheaper. 

Now practically everybody's involved in IT, whether they realise it or not. 


First the accusations
Because the Y2K bug simply didn't strike the way doomsayers said it would, many of those who spent heavily (billions were spent worldwide) to counter the bug accused the IT experts of cheating them. 

After all, the Y2K business created a new, money-making field for IT experts who came up with Y2K solutions. 

The experts argue that the Y2K Bug didn't bug anyone (much) because everyone was pretty much ready for it. In short, a POSSIBLE problem was largely avoided by prevention. Better than cure any time.

So the hoo-ha died down.


Next came denial
The nightmare started just one month after the world took heart that it was not going to end in a mass of Y2K hysteria. A series of denial-of-service (DoS) attacks from unknown hackers swamped the largest sites on the Web and demonstrated just how vulnerable they were. 

These hackers planted attack programs on unsuspecting computers (getting others to do the dirty work, as always), turning them into "zombies" that bombarded the targeted websites with bogus traffic. 

The sheer volume of requests overwhelmed the targeted servers and promptly crashed them.


Then there was love
In May, the I Love You virus exploited the scripting features of Microsoft's Outlook e-mail client, mailing copies of itself to everyone in each victim's address book. The resulting flood of messages forced corporations all over the world to shut down their e-mail systems. 

Over time, more and more corporate hacks were being publicised, and many experts were hinting at the great number that didn't get reported. 


Playing with the big boys now
Then, near the end of the year, hacktivists (hackers with a cause, the online equivalent of activist groups like Greenpeace) targeted OPEC (the Organisation of Petroleum Exporting Countries) and Israeli government sites.

In October, even software giant Microsoft got hacked, further demonstrating a new level of hacker sophistication. 

Many experts predict 2001 won't be any better. 

The main culprits in 2000, after the hackers that is, were application and security vendors who left holes (vulnerabilities) in their software, as well as IT managers who left holes in their networks. As a result, security services have become a major part of any comprehensive security plan. 


The big get bigger
Mergers being the popular business choice of 2000, Internet heavyweight America Online moved to merge with media powerhouse Time Warner, forming a US$109 billion (K370 billion - more than 100 times the PNG Government budget) behemoth (yes, big words for big boys). 

The announcement of the deal created a lot of news in usually dull January, with speculation that the merger would create the next monopoly. 

Since then, however, US competition regulators have scrutinised the pairing into obscurity. The deal was finally approved by the US Federal Trade Commission on Dec 14. 


Do ya (anti)trust Microsoft?
In a series of landmark decisions, US Judge Thomas Penfield Jackson in April found Microsoft guilty of anti-competitive practices, then in June ordered the company to be split in two. 

But since then, the story has fallen off the screen as court proceedings promise to push the case into 2001 and beyond. 

Microsoft itself is conducting business as usual. 


Big brothers don't like MP3s 
The MP3 format (short for MPEG Audio Layer 3; MPEG stands for the Moving Picture Experts Group) shrinks music files to a fraction of their original size, small enough to send and play over the Internet. 
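
As a rough guide to the savings (typical figures, not exact ones): a minute of uncompressed CD audio takes up about 10MB, while the same minute encoded as a standard 128-kilobit-per-second MP3 takes less than 1MB, roughly a tenth of the size.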

There is also software out there that allows you to "rip" tracks off CDs, converting them into MP3 files.

Then Napster came along. This nifty program allows users to share MP3 files, and created a new acronym: P2P (peer-to-peer computing).

The next thing everyone knew, the multi-billion dollar music industry felt threatened.

Music industry leaders like Sony moved in, and Napster Inc was in danger of being shut down over copyright violations.

Napster Inc has so far avoided a court-ordered shutdown for copyright violation, but such a shutdown seems inevitable. 


Dot.coms going Dot.bombs
It was fun (and very, very enriching) while it lasted. We all knew it was going to happen. 

Just didn't know it would happen so soon or so quickly. 

Sparked by the Microsoft guilty verdict in April, tech stocks started a sharp and unyielding decline. 

Once-flush startups were soon burning through cash just to survive, and planned initial public offerings got put on the back burner. 

Even Amazon.com, the most successful commercial site thus far outside the adult-entertainment business, is still doing business with its accounts in the red.


Wireless static
Has a technology ever been so poised to explode and yet remained so far away from reality? Wireless is everywhere, from your cell phones, PDAs (personal digital assistants) and pagers to your enterprise and instant-messaging applications. 

But much work remains to be done in standardising network protocols, creating useful enterprise applications and syncing the considerable technological advancements of wireless devices with the needs of users. 


B2B rises ... and falls
One year ago, B2B (business-to-business) e-commerce was about to explode. Exchanges and other types of e-marketplaces were supposed to lead us all into a new land of opportunity. 

In February, the Big Three automakers formed Covisint LLC, a marketplace alliance with Commerce One Inc and Oracle Corp, igniting dozens of similar launches. But like their business-to-consumer counterparts, many B2B exchanges, giddy with the promise of disintermediating entire industries despite their weak foundations, quickly collapsed. 

This month's shutdown of two Ventro Corp exchanges, Chemdex and Promedix, warned the exchanges still standing of the need to improve their alliances and business models next year.


Banner year for AMD, Intel flags 
Leading chipmaker Intel Corp stumbled this year. Competitor Advanced Micro Devices (AMD) Inc, on the other hand, had lots of fun at Intel's expense.

It all started way back in the 20th century, in 1998 in fact. Back then, Intel lorded over AMD. Craig Barrett, who succeeded Andy Grove as Intel CEO in March 1998, kicked off a strategy to diversify the company's revenue base. 

AMD, meanwhile, was struggling with financial losses and sporadic inventory and manufacturing problems. Things got so bad that analysts questioned whether the company would have to find co-tenants to help pay for its planned fabrication facility in Dresden, Germany. 

In the background, however, the seeds of a reversal were being sown. In the middle of 1998, Intel halted construction on a plant in Fort Worth, Texas, because of the tepid economic outlook in Asia (which suffered a major downturn) and the failure of a tax reform bill in the Texas legislature, according to Intel spokesman Chuck Mulloy. 

Around the same time, Intel engineers in Folsom, California, were discovering that the upcoming 820 chipset for faster Pentium IIIs (codenamed Coppermine) and a then relatively obscure memory technology called Rambus weren't working together as planned. 

Meanwhile, AMD was designing a chip called the K7, which would be described at the Microprocessor Forum in October. 

Compare that to the first half of 2000. 

Intel found itself mired in a dire processor shortage, caused in part by a series of 1999 delays to the 820 chipset and other parts. The scrapping of the Fort Worth plans was also having an effect. 

In addition, Rambus memory had failed, as predicted, to come down in price. 

To spur sales of Coppermine Pentium IIIs, Intel came out with a bridge chip called the Memory Translator Hub (MTH) that was supposed to allow Pentium III PCs to use cheaper standard memory. 

Unfortunately, problems with the MTH forced a recall that cost Intel US$253 million (K845 million) and, more importantly, gave a black eye to its sterling reputation for manufacturing. 

In June, Intel canceled Timna, an inexpensive processor that would have been paired with the MTH. 

Non-Rambus problems cropped up too. The release schedule for Itanium, a 64-bit server chip destined to take on Sun's UltraSparc processors, was pushed back to 2001. The 1.13-GHz Pentium III was recalled. 

The company managed to come out with the Pentium 4, perhaps the bright spot of the year. Still, partly because software has yet to be tweaked for the chip, benchmark tests have shown it to provide only middling performance improvements. 

By contrast, AMD was seeing tremendous consumer acceptance with Athlon, the public name for the K7. With the new chip, AMD took the speed crown from Intel, landed contracts with all the major PC makers but Dell Computer, and obtained a new reputation for high-performance technology. 

It also didn't face shortages to the same degree. For the first time since 1995, the company will post an annual profit in 2000. 


Onward to 2001
Things change, and in the IT industry, change is good. Things are going to look better, according to some experts. The Pentium 4 will soon be worth its price. 

There will be no difference between TVs and PCs. The Mac will no longer be superior to the PC. 

The Dot-com collapse will reverse.

And I will mull over how quickly my PC has become nigh-obsolete. Have a pleasant New Year.

Published in The National on May 4, 2001. Reprinted in a Gulf Times supplement on computers and technology in 2002.

Virtual gateway

By Daniel Lam
Ask the average educated adult in PNG who has had plenty of exposure to computers just what a computer is and you will probably get this answer: a complex tool for making life easier and cheaper, watching movies and surfing the Internet.

Ask an equally exposed child the same question, and the answer would be "game machine" or something like that.

My neighbours' kids have a habit of knocking on my door every weekend and asking for permission to play at my place.

And they don't mean games like Monopoly or Bingo.

They are looking at Half-Life (an excellent 3D first-person shooter), maybe Street Fighter Zero 2 (a one-on-one beat-em-up game ported over from amusement arcades to the home game console, then to the PC).

The difference is that the adult sees the computer as a tool for serious things, like calculating payroll or conducting research, while the child sees it as "fun", even "magical", thanks to the wide range of games.

According to various reports, the child's view is more accurate.

When computer "visionary" Alan Turing conceived of the computer, it was meant to be a close approximation to a machine that could become any other machine - a Universal Turing Machine.

Think about it ... any device that carries out a finite mechanical process can be replicated using a computer.

If you want a word processor, the computer can be configured to become a fancy, virtual typewriter.

The computer's capabilities are even more pronounced in games ... in flight simulators and strategy games the computer is configured to behave like many, many mechanical devices at once - aircraft, clouds, other pilots or military strategists - and the computer can determine the results of complex interactions between these virtual objects.

The computer can thus become a virtual car, a virtual board game, a virtual accountant, etc.
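
To picture that idea, here is a toy sketch (purely illustrative, written for this column, not anything from Turing's own work): one small program behaves as a different "machine" depending on the list of instructions it is fed.

    # Illustrative only: one tiny interpreter, many possible "machines".
    def run(machine, value):
        # 'machine' is just a list of instructions; the computer becomes
        # whichever device those instructions describe.
        for operation, amount in machine:
            if operation == "add":
                value += amount
            elif operation == "multiply":
                value *= amount
        return value

    adding_machine = [("add", 10)]
    doubling_machine = [("multiply", 2), ("add", 1)]

    print(run(adding_machine, 5))    # acts like a simple adder: 15
    print(run(doubling_machine, 5))  # acts like a different device: 11

The hardware never changes; only the instructions do.
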
Various IT experts promote the idea that virtual phenomena are just as real as natural ones, even if they are, well, unreal.

This is not to argue that if something looks like a car and revs like a car, then it must be a car - that's not the point.

After all, when a computer is programmed to behave like another machine, say a word processor, it remains a computer, not a word processor.

It is pretending.

Many machines in the real world are not physical things themselves; they are intangible phenomena derived from the interactions between physical things.

For example, take an organisation like the Papua New Guinea Banking Corporation (PNGBC).

The PNGBC can be counted and labelled. It exhibits behaviour.

It occupies a building. It is an entity, real because it can interact with other real things.

But the PNGBC is not the people who work for it, or the offices or the legal documents that gave it "life".

The PNGBC is defined by the relationships between these parts, not the parts themselves.

Similarly, the Government of PNG is real, but you can't see or speak to the Government ... you are actually speaking to a representative of the Government (be it an official, minister or even the Prime Minister himself).

Given the right input and the right software, a computer can simulate anything ... from natural occurrences like a storm to man-made disasters like nuclear blasts.

The computer can be used to provide results based on such simulations.

If you accept this argument, that real-world objects can themselves be "virtual", then the computer is more than just a machine with plenty of circuits and wires ... it is a virtual wishing well, capable of becoming anything you want.

Of course, that is not to suggest that the computer has no limits. After all, as the behaviour we wish to elicit becomes more sophisticated, program complexity grows exponentially.

And the more complex the program, the greater the possibility of a total collapse.

Yet we humans, made up of the relationships between millions, perhaps even billions, of working components, function well most of the time.

Maybe, one day, computers will be able to achieve the same level of sophistication.

Commander Data may not be just a figment of Star Trekkers' imaginations.

Already, our perception of the computer is changing from a mindless automaton that follows precise instructions to a "cyberspace generator" in which many virtual machines interact to create something bigger, more robust, adaptable and creative.

The old view no longer has a place in today's world of "virtuality", where computers are linked and their users are connected, no matter where they are.

We need to look on the computer as much more than a mechanical slave. 

The world is getting smaller all the time.

And my neighbours' kids will still find time to knock on my door, hoping for a few hours' release from boredom (and parents) via the virtual gateway.